Journal articles: 80,413 results found
1. High-throughput screening of CO_(2) cycloaddition MOF catalyst with an explainable machine learning model
Authors: Xuefeng Bai, Yi Li, Yabo Xie, Qiancheng Chen, Xin Zhang, Jian-Rong Li. Green Energy & Environment (SCIE, EI, CAS), 2025, Issue 1, pp. 132-138 (7 pages)
The high porosity and tunable chemical functionality of metal-organic frameworks (MOFs) make them a promising catalyst design platform. High-throughput screening of catalytic performance is feasible since a large MOF structure database is available. In this study, we report a machine learning model for high-throughput screening of MOF catalysts for the CO_(2) cycloaddition reaction. The descriptors for model training were judiciously chosen according to the reaction mechanism, which leads to an accuracy of up to 97% with the 75% quantile of the training set as the classification criterion. The feature contributions were further evaluated with SHAP and PDP analysis to provide a degree of physical understanding. Using the model, 12,415 hypothetical MOF structures and 100 reported MOFs were evaluated at 100 ℃ and 1 bar within one day, and 239 potentially efficient catalysts were discovered. Among them, MOF-76(Y) achieved the top experimental performance among reported MOFs, in good agreement with the prediction.
Keywords: Metal-organic frameworks; High-throughput screening; Machine learning; Explainable model; CO_(2) cycloaddition
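For readers who want to see the shape of the workflow this abstract describes, the following minimal Python sketch (not the authors' code) labels descriptor rows by the 75% quantile of a target quantity, fits a tree-based classifier, and attributes predictions with SHAP. The descriptor names and data are placeholders.

```python
# Minimal illustration (not the authors' code): quantile-threshold labelling,
# tree-based classification, and SHAP-based feature attribution.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Placeholder descriptor table; real work would use mechanism-informed MOF descriptors.
X = pd.DataFrame(rng.normal(size=(1000, 5)),
                 columns=["pore_diameter", "surface_area", "metal_electronegativity",
                          "open_metal_site_density", "linker_polarity"])
activity = X["open_metal_site_density"] * 2 + X["pore_diameter"] + rng.normal(scale=0.5, size=1000)

# Label catalysts as "high-performing" if activity exceeds the 75% quantile of the training set.
X_tr, X_te, a_tr, a_te = train_test_split(X, activity, test_size=0.2, random_state=0)
threshold = np.quantile(a_tr, 0.75)
y_tr, y_te = (a_tr > threshold).astype(int), (a_te > threshold).astype(int)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("classification accuracy:", accuracy_score(y_te, clf.predict(X_te)))

# SHAP values quantify each descriptor's contribution to the predicted class.
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_te)
print("SHAP output shape:", np.shape(shap_values))
```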
2. Early identification of high-risk patients admitted to emergency departments using vital signs and machine learning
Authors: Qingyuan Liu, Yixin Zhang, Jian Sun, Kaipeng Wang, Yueguo Wang, Yulan Wang, Cailing Ren, Yan Wang, Jiashan Zhu, Shusheng Zhou, Mengping Zhang, Yinglei Lai, Kui Jin. World Journal of Emergency Medicine, 2025, Issue 2, pp. 113-120 (8 pages)
BACKGROUND: Rapid and accurate identification of high-risk patients in emergency departments (EDs) is crucial for optimizing resource allocation and improving patient outcomes. This study aimed to develop an early prediction model for identifying high-risk patients in EDs using initial vital sign measurements. METHODS: This retrospective cohort study analyzed initial vital signs from the Chinese Emergency Triage, Assessment, and Treatment (CETAT) database, collected between January 1, 2020, and June 25, 2023. The primary outcome was the identification of high-risk patients needing immediate treatment. Various machine learning methods, including a deep-learning-based multilayer perceptron (MLP) classifier, were evaluated. Model performance was assessed using the area under the receiver operating characteristic curve (AUC-ROC). AUC-ROC values were reported for three scenarios: a default case, a scenario requiring sensitivity greater than 0.8 (Scenario I), and a scenario requiring specificity greater than 0.8 (Scenario II). SHAP values were calculated to determine the importance of each predictor within the MLP model. RESULTS: A total of 38,797 patients were analyzed, of whom 18.2% were identified as high-risk. Comparative analysis of the predictive models showed AUC-ROC values ranging from 0.717 to 0.738, with the MLP model outperforming logistic regression (LR), Gaussian naive Bayes (GNB), and the National Early Warning Score (NEWS). SHAP value analysis identified coma state, peripheral capillary oxygen saturation (SpO_(2)), and systolic blood pressure as the top three predictive factors in the MLP model, with coma state contributing the most. CONCLUSION: Compared with other methods, the MLP model with initial vital signs demonstrated optimal prediction accuracy, highlighting its potential to enhance clinical decision-making in ED triage.
Keywords: Machine learning; Triage; Emergency medicine; Decision support systems
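A hedged sketch of the evaluation logic reported above: an MLP classifier scored by AUC-ROC, with thresholds chosen under the sensitivity (Scenario I) and specificity (Scenario II) constraints. The vital-sign features and data are synthetic placeholders.

```python
# Sketch of the evaluation logic only: AUC-ROC plus threshold selection under
# a sensitivity (Scenario I) or specificity (Scenario II) constraint.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
# Placeholder vital-sign matrix: [heart_rate, SpO2, systolic_bp, resp_rate, coma_state]
X = rng.normal(size=(5000, 5))
y = (X[:, 4] + 0.5 * X[:, 2] + rng.normal(scale=1.0, size=5000) > 1.5).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=1).fit(X_tr, y_tr)
proba = mlp.predict_proba(X_te)[:, 1]
print("AUC-ROC:", roc_auc_score(y_te, proba))

fpr, tpr, thresholds = roc_curve(y_te, proba)
sens, spec = tpr, 1 - fpr
# Scenario I: best specificity achievable while keeping sensitivity > 0.8
mask = sens > 0.8
print("Scenario I threshold:", thresholds[mask][np.argmax(spec[mask])])
# Scenario II: best sensitivity achievable while keeping specificity > 0.8
mask = spec > 0.8
print("Scenario II threshold:", thresholds[mask][np.argmax(sens[mask])])
```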
3. Machine learning empowers efficient design of ternary organic solar cells with PM6 donor
Authors: Kiran A. Nirmal, Tukaram D. Dongale, Santosh S. Sutar, Atul C. Khot, Tae Geun Kim. Journal of Energy Chemistry, 2025, Issue 1, pp. 337-347 (11 pages)
Organic solar cells (OSCs) hold great potential as a photovoltaic technology for practical applications. However, the traditional experimental trial-and-error method for designing and engineering OSCs can be complex, expensive, and time-consuming. Machine learning (ML) techniques enable the proficient extraction of information from datasets, allowing the development of realistic models that are capable of predicting the efficacy of materials with commendable accuracy. The PM6 donor has great potential for high-performance OSCs. However, for the rational design of a ternary blend, it is crucial to accurately forecast the power conversion efficiency (PCE) of ternary OSCs (TOSCs) based on a PM6 donor. Accordingly, we collected the device parameters of PM6-based TOSCs and evaluated the feature importance of their molecular descriptors to develop predictive models. In this study, we used five different ML algorithms for analysis and prediction. For the analysis, the classification and regression tree provided different rules, heuristics, and patterns from the heterogeneous dataset. The random forest algorithm outperformed the other ML algorithms in predicting the output performance of PM6-based TOSCs. Finally, we validated the ML outcomes by fabricating PM6-based TOSCs. Our study presents a rapid strategy for assessing a high PCE while elucidating the substantial influence of diverse descriptors.
Keywords: Machine learning; Ternary organic solar cells; PM6 donor; PCE
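The two roles described in this abstract, an interpretable regression tree for rules and a random forest for PCE prediction, can be illustrated with a short scikit-learn sketch; the feature names and data below are assumptions, not the paper's dataset.

```python
# Illustrative sketch: a regression tree for human-readable rules and a random
# forest for PCE prediction, mirroring the two roles described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor, export_text
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
features = ["acceptor_lumo", "acceptor_bandgap", "third_component_ratio", "film_thickness"]
X = rng.normal(size=(400, len(features)))
pce = 12 + 2 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.8, size=400)  # placeholder PCE values

X_tr, X_te, y_tr, y_te = train_test_split(X, pce, test_size=0.2, random_state=2)

tree = DecisionTreeRegressor(max_depth=3, random_state=2).fit(X_tr, y_tr)
print(export_text(tree, feature_names=features))   # rules / heuristics from the tree

rf = RandomForestRegressor(n_estimators=300, random_state=2).fit(X_tr, y_tr)
print("RF R^2 on held-out devices:", r2_score(y_te, rf.predict(X_te)))
```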
4. Improving performance of screening MM/PBSA in protein–ligand interactions via machine learning
Authors: Yuan-Qiang Chen, Yao Xu, Yu-Qiang Ma, Hong-Ming Ding. Chinese Physics B, 2025, Issue 1, pp. 486-496 (11 pages)
Accurately estimating protein–ligand binding free energy is crucial for drug design and biophysics, yet it remains a challenging task. In this study, we applied the screening molecular mechanics/Poisson–Boltzmann surface area (MM/PBSA) method in combination with various machine learning techniques to compute the binding free energies of protein–ligand interactions. Our results demonstrate that machine learning outperforms direct screening MM/PBSA calculations in predicting protein–ligand binding free energies. Notably, the random forest (RF) method exhibited the best predictive performance, with a Pearson correlation coefficient (r_(p)) of 0.702 and a mean absolute error (MAE) of 1.379 kcal/mol. Furthermore, we analyzed feature importance rankings in the gradient boosting (GB), adaptive boosting (AdaBoost), and RF methods, and found that feature selection significantly impacted predictive performance. In particular, molecular weight (MW) and van der Waals (VDW) energies played a decisive role in the prediction. Overall, this study highlights the potential of combining machine learning methods with screening MM/PBSA for accurately predicting binding free energies in biosystems.
Keywords: Molecular mechanics/Poisson-Boltzmann surface area (MM/PBSA); Binding free energy; Machine learning; Protein-ligand interaction
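A minimal sketch, assuming MM/PBSA-style energy terms plus molecular weight as inputs, of the regression-and-scoring step (Pearson r and MAE) described above; the data are synthetic and the feature set is illustrative only.

```python
# Hedged sketch: regression on MM/PBSA-style energy terms plus molecular weight,
# scored with the Pearson correlation and MAE used in the abstract.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
# Placeholder features: [VDW, electrostatic, polar_solvation, SASA, molecular_weight]
X = rng.normal(size=(800, 5))
dG = -1.2 * X[:, 0] - 0.6 * X[:, 4] + rng.normal(scale=1.0, size=800)  # synthetic binding free energy

X_tr, X_te, y_tr, y_te = train_test_split(X, dG, test_size=0.2, random_state=3)
rf = RandomForestRegressor(n_estimators=500, random_state=3).fit(X_tr, y_tr)
pred = rf.predict(X_te)
print("Pearson r_p:", pearsonr(y_te, pred)[0])
print("MAE (synthetic units):", mean_absolute_error(y_te, pred))
```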
5. Evolution of pore systems in low-maturity oil shales during thermal upgrading - Quantified by dynamic SEM and machine learning (Cited by 2)
Authors: Jun Liu, Xue Bai, Derek Elsworth. Petroleum Science (SCIE, EI, CAS, CSCD), 2024, Issue 3, pp. 1739-1750 (12 pages)
In-situ upgrading by heating is feasible for low-maturity shale oil, where the pore space dynamically evolves. We characterize this response for a heated substrate concurrently imaged by SEM. We systematically follow the evolution of pore quantity, size (length, width and cross-sectional area), orientation, shape (aspect ratio, roundness and solidity) and their anisotropy, interpreted by machine learning. Results indicate that heating generates new pores in both organic matter and inorganic minerals. However, the newly formed pores are smaller than the original pores and thus reduce the average lengths and widths of the bedding-parallel pore system. Conversely, the average pore lengths and widths are increased in the bedding-perpendicular direction. Besides, heating increases the cross-sectional area of pores in low-maturity oil shales, where this growth tendency fluctuates at <300 ℃ but becomes steady at >300 ℃. In addition, the orientation and shape of the newly formed, heating-induced pores follow the habit of the original pores and the initial probability distributions of pore orientation and shape. Herein, limited anisotropy is detected in pore direction and shape, indicating similar modes of evolution both bedding-parallel and bedding-normal. We propose a straightforward but robust model to describe the evolution of the pore system in low-maturity oil shales during heating.
Keywords: Low-maturity oil shale; Pore elongation; Organic matter pyrolysis; In-situ thermal upgrading; Scanning electron microscopy (SEM); Machine learning
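The pore descriptors followed in this study (length, width, cross-sectional area, orientation, aspect ratio, roundness, solidity) can be extracted from a segmented SEM frame with standard scikit-image calls, as in the sketch below; the binary image here is a random placeholder, not the authors' data.

```python
# Sketch of the pore-descriptor extraction step, assuming a binary (thresholded)
# SEM image where pores are foreground; calls are standard scikit-image.
import numpy as np
from skimage.measure import label, regionprops

rng = np.random.default_rng(4)
binary_pores = rng.random((512, 512)) > 0.95        # placeholder segmented SEM frame

props = regionprops(label(binary_pores))
for p in props[:5]:
    length, width = p.major_axis_length, p.minor_axis_length
    aspect_ratio = length / width if width > 0 else np.nan
    roundness = 4 * np.pi * p.area / p.perimeter**2 if p.perimeter > 0 else np.nan
    print(f"area={p.area}, length={length:.1f}, width={width:.1f}, "
          f"orientation={np.degrees(p.orientation):.1f} deg, "
          f"aspect={aspect_ratio:.2f}, roundness={roundness:.2f}, solidity={p.solidity:.2f}")
```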
6. Machine learning for predicting the outcome of terminal ballistics events (Cited by 2)
Authors: Shannon Ryan, Neeraj Mohan Sushma, Arun Kumar AV, Julian Berk, Tahrima Hashem, Santu Rana, Svetha Venkatesh. Defence Technology (防务技术) (SCIE, EI, CAS, CSCD), 2024, Issue 1, pp. 14-26 (13 pages)
Machine learning (ML) is well suited to the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis, such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when it is not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than is achievable with any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Keywords: Machine learning; Artificial intelligence; Physics-informed machine learning; Terminal ballistics; Armour
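One of the physics-informing routes named above, enforced monotonicity, can be expressed directly through XGBoost's monotone-constraint interface, as in this illustrative sketch; the features, constraint signs, and data are assumptions rather than the paper's implementation.

```python
# Illustrative only: enforced monotonicity expressed through XGBoost's
# monotone constraints, one of the physics-informing routes mentioned above.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(5)
# Placeholder features: [target_thickness, target_hardness, impact_obliquity]
X = rng.uniform(low=[2, 100, 0], high=[50, 600, 60], size=(1000, 3))
v50 = 0.8 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(scale=10, size=1000)  # synthetic ballistic limit

# Constrain V50 to be non-decreasing in thickness and hardness (assumed physical
# expectation here), unconstrained in the remaining feature.
model = XGBRegressor(n_estimators=400, max_depth=4,
                     monotone_constraints="(1,1,0)").fit(X, v50)
print(model.predict(X[:3]))
```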
7. Machine learning applications on lunar meteorite minerals: From classification to mechanical properties prediction (Cited by 1)
Authors: Eloy Peña-Asensio, Josep M. Trigo-Rodríguez, Jordi Sort, Jordi Ibáñez-Insa, Albert Rimola. International Journal of Mining Science and Technology (SCIE, EI, CAS, CSCD), 2024, Issue 9, pp. 1283-1292 (10 pages)
Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value, nondestructive testing methods are essential. This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis. This study explores the application of machine learning algorithms in predicting the mineralogical and mechanical properties of the DHOFAR 1084, JAH 838, and NWA 11444 lunar meteorites based solely on their atomic percentage compositions. Leveraging a prior-data fitted network model, we achieved near-perfect classification scores for meteorites, mineral groups, and individual minerals. The regressor models, notably the KNeighbors model, provided an outstanding estimate of the mechanical properties previously measured by nanoindentation tests, such as hardness, reduced Young's modulus, and elastic recovery. Further considerations on the nature and physical properties of the minerals forming these meteorites, including porosity, crystal orientation, or shock degree, are essential for refining predictions. Our findings underscore the potential of machine learning in enhancing mineral identification and mechanical property estimation in lunar exploration, paving the way for new advancements and quick assessments in extraterrestrial mineral mining, processing, and research.
Keywords: Meteorites; Moon; Mineralogy; Machine learning; Mechanical properties
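A minimal sketch of the regression step mentioned above: a k-nearest-neighbours model mapping atomic-percentage compositions to a nanoindentation-derived property. The composition columns and values are placeholders.

```python
# Minimal sketch of the regression step: k-nearest-neighbours mapping from
# atomic-percentage composition to a nanoindentation-derived property.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
atomic_pct = rng.dirichlet(np.ones(6), size=300) * 100   # placeholder O/Si/Al/Mg/Fe/Ca percentages
hardness = 0.05 * atomic_pct[:, 1] + 0.02 * atomic_pct[:, 2] + rng.normal(scale=0.3, size=300)

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
scores = cross_val_score(knn, atomic_pct, hardness, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```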
8. Machine learning in metal-ion battery research: Advancing material prediction, characterization, and status evaluation (Cited by 1)
Authors: Tong Yu, Chunyang Wang, Huicong Yang, Feng Li. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, Issue 3, pp. 191-204, I0006 (15 pages)
Metal-ion batteries (MIBs), including alkali metal-ion (Li^(+), Na^(+), and K^(+)), multi-valent metal-ion (Zn^(2+), Mg^(2+), and Al^(3+)), metal-air, and metal-sulfur batteries, play an indispensable role in electrochemical energy storage. However, the performance of MIBs is significantly influenced by numerous variables, resulting in multi-dimensional and long-term challenges in the field of battery research and performance enhancement. Machine learning (ML), with its capability to solve intricate tasks and perform robust data processing, is now catalyzing a revolutionary transformation in the development of MIB materials and devices. In this review, we summarize the utilization of ML algorithms that have expedited research on MIBs over the past five years. We present an extensive overview of existing algorithms, elucidating their details, advantages, and limitations in various applications, which encompass electrode screening, material property prediction, electrolyte formulation design, electrode material characterization, manufacturing parameter optimization, and real-time battery status monitoring. Finally, we propose potential solutions and future directions for the application of ML in advancing MIB development.
Keywords: Metal-ion battery; Machine learning; Electrode materials; Characterization; Status evaluation
9. An Intelligent SDN-IoT Enabled Intrusion Detection System for Healthcare Systems Using a Hybrid Deep Learning and Machine Learning Approach (Cited by 1)
Authors: R. Arthi, S. Krishnaveni, Sherali Zeadally. China Communications (SCIE, CSCD), 2024, Issue 10, pp. 267-287 (21 pages)
The advent of pandemics such as COVID-19 significantly impacts human behaviour and lives every day. Therefore, it is essential to make medical services connected to the internet available in every remote location during these situations. Moreover, the security issues in the Internet of Medical Things (IoMT) used in these services make the situation even more critical, because cyberattacks on medical devices might cause treatment delays or clinical failures. Hence, services in the healthcare ecosystem need rapid, uninterrupted, and secure facilities. The solution provided in this research addresses security concerns and service availability for patients with critical health conditions in remote areas. This research aims to develop an intelligent software-defined network (SDN) enabled secure framework for the IoT healthcare ecosystem. We propose a hybrid of machine learning and deep learning techniques (DNN + SVM) to identify network intrusions in sensor-based healthcare data. In addition, this system can efficiently monitor connected devices and suspicious behaviours. Finally, we evaluate the performance of our proposed framework using various performance metrics based on healthcare application scenarios. The experimental results show that the proposed approach effectively detects and mitigates attacks in SDN-enabled IoT networks and performs better than other state-of-the-art approaches.
Keywords: Deep neural network; Healthcare; Intrusion detection system; IoT; Machine learning; Software-defined networks
10. Machine learning for membrane design and discovery (Cited by 1)
Authors: Haoyu Yin, Muzi Xu, Zhiyao Luo, Xiaotian Bi, Jiali Li, Sui Zhang, Xiaonan Wang. Green Energy & Environment (SCIE, EI, CAS, CSCD), 2024, Issue 1, pp. 54-70 (17 pages)
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norm of new materials for energy and the environment. This review provides an overview and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, together with the current bottlenecks and potential solutions. Through an applications-based perspective of AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, including gas, liquid, and fuel cell separation membranes. Furthermore, the best practices for integrating ML methods and specific application targets in membrane design and discovery are presented, and an ideal paradigm is proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are also highlighted in the end.
Keywords: Machine learning; Membranes; AI for membrane; Data-driven design
11. Light-Activated Virtual Sensor Array with Machine Learning for Non-Invasive Diagnosis of Coronary Heart Disease (Cited by 1)
Authors: Jiawang Hu, Hao Qian, Sanyang Han, Ping Zhang, Yuan Lu. Nano-Micro Letters (SCIE, EI, CAS, CSCD), 2024, Issue 12, pp. 427-448 (22 pages)
Early non-invasive diagnosis of coronary heart disease (CHD) is critical. However, it is challenging to achieve accurate CHD diagnosis via breath detection. In this work, heterostructured complexes of black phosphorus (BP) and two-dimensional carbide and nitride (MXene) with high gas sensitivity and photoresponsiveness were formulated using a self-assembly strategy. A light-activated virtual sensor array (LAVSA) based on BP/Ti_(3)C_(2)Tx was prepared under photomodulation and further assembled into an instant gas sensing platform (IGSP). In addition, a machine learning (ML) algorithm was introduced to help the IGSP detect and recognize the signals of breath samples to diagnose CHD. Owing to the synergistic effect of BP and Ti_(3)C_(2)Tx as well as photoexcitation, the synthesized heterostructured complexes exhibited higher performance than pristine Ti_(3)C_(2)Tx, with a response value 26% higher than that of pristine Ti_(3)C_(2)Tx. In addition, with the help of a pattern recognition algorithm, LAVSA successfully detected and identified 15 odor molecules affiliated with alcohols, ketones, aldehydes, esters, and acids. Meanwhile, with the assistance of ML, the IGSP achieved 69.2% accuracy in detecting the breath odor of 45 volunteers comprising healthy people and CHD patients. In conclusion, an immediate, low-cost, and accurate prototype was designed and fabricated for the non-invasive diagnosis of CHD, which provides a generalized solution for diagnosing other diseases and more complex application scenarios.
Keywords: Black phosphorus/MXene heterostructures; Light-activated virtual sensor array; Diagnosis of coronary heart disease; Machine learning
12. Accurate and efficient remaining useful life prediction of batteries enabled by physics-informed machine learning (Cited by 1)
Authors: Liang Ma, Jinpeng Tian, Tieling Zhang, Qinghua Guo, Chunsheng Hu. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, Issue 4, pp. 512-521 (10 pages)
The safe and reliable operation of lithium-ion batteries necessitates the accurate prediction of remaining useful life (RUL). However, this task is challenging due to the diverse ageing mechanisms, various operating conditions, and limited measured signals. Although data-driven methods are perceived as a promising solution, they ignore intrinsic battery physics, leading to compromised accuracy, low efficiency, and low interpretability. In response, this study integrates domain knowledge into deep learning to enhance RUL prediction performance. We demonstrate accurate RUL prediction using only a single charging curve. First, a generalisable physics-based model is developed to extract ageing-correlated parameters that can describe and explain battery degradation from battery charging data. The parameters inform a deep neural network (DNN) to predict RUL with high accuracy and efficiency. The trained model is validated on 3 types of batteries working under 7 conditions, considering fully charged and partially charged cases. Using data from one cycle only, the proposed method achieves a root mean squared error (RMSE) of 11.42 cycles and a mean absolute relative error (MARE) of 3.19% on average, which are over 45% and 44% lower than those of two state-of-the-art data-driven methods, respectively. Besides its accuracy, the proposed method also outperforms existing methods in terms of efficiency, input burden, and robustness. The inherent relationship between the model parameters and the battery degradation mechanism is further revealed, substantiating the intrinsic superiority of the proposed method.
Keywords: Lithium-ion batteries; Remaining useful life; Physics-informed machine learning
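The two reported error metrics (RMSE in cycles and MARE in %) and the parameter-to-RUL mapping can be sketched as follows; the "ageing parameters" here are random placeholders standing in for the physics-derived quantities the paper extracts from a charging curve.

```python
# Sketch under stated assumptions: a small neural network maps ageing-correlated
# parameters (extracted elsewhere from one charging curve) to RUL, and the two
# reported metrics (RMSE in cycles, MARE in %) are computed explicitly.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
params = rng.normal(size=(600, 4))                 # placeholder physics-based ageing parameters
rul = 800 - 150 * params[:, 0] + rng.normal(scale=20, size=600)

X_tr, X_te, y_tr, y_te = train_test_split(params, rul, test_size=0.2, random_state=7)
dnn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=7).fit(X_tr, y_tr)
pred = dnn.predict(X_te)

rmse = np.sqrt(np.mean((pred - y_te) ** 2))                   # cycles
mare = np.mean(np.abs(pred - y_te) / np.abs(y_te)) * 100      # %
print(f"RMSE = {rmse:.1f} cycles, MARE = {mare:.2f} %")
```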
13. Artificial Intelligence Meets Flexible Sensors: Emerging Smart Flexible Sensing Systems Driven by Machine Learning and Artificial Synapses (Cited by 5)
Authors: Tianming Sun, Bin Feng, Jinpeng Huo, Yu Xiao, Wengan Wang, Jin Peng, Zehua Li, Chengjie Du, Wenxian Wang, Guisheng Zou, Lei Liu. Nano-Micro Letters (SCIE, EI, CAS, CSCD), 2024, Issue 1, pp. 235-273 (39 pages)
The recent wave of the artificial intelligence (AI) revolution has aroused unprecedented interest in making human society intelligent. As an essential component that bridges the physical world and digital signals, flexible sensors are evolving from single sensing elements to smarter systems capable of highly efficient acquisition, analysis, and even perception of vast, multifaceted data. While challenging from a manual perspective, the development of intelligent flexible sensing has been remarkably facilitated by the rapid advances of brain-inspired AI innovations at both the algorithm (machine learning) and the framework (artificial synapses) level. This review presents the recent progress of emerging AI-driven, intelligent flexible sensing systems. The basic concepts of machine learning and artificial synapses are introduced. The new enabling features induced by the fusion of AI and flexible sensing are comprehensively reviewed, which significantly advance applications such as flexible sensory systems, soft/humanoid robotics, and human activity monitoring. As two of the most profound innovations of the twenty-first century, the deep integration of flexible sensing and AI technology holds tremendous potential for creating a smarter world for human beings.
Keywords: Flexible electronics; Wearable electronics; Neuromorphic; Memristor; Deep learning
14. Application of machine learning in perovskite materials and devices: A review
Authors: Ming Chen, Zhenhua Yin, Zhicheng Shan, Xiaokai Zheng, Lei Liu, Zhonghua Dai, Jun Zhang, Shengzhong (Frank) Liu, Zhuo Xu. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, Issue 7, pp. 254-272 (19 pages)
Metal-halide hybrid perovskite materials are excellent candidates for solar cells and photoelectric devices. In recent years, machine learning (ML) techniques have developed rapidly in many fields and provided ideas for material discovery and design. ML can be applied to discover new materials quickly and effectively, with significant savings in resources and time compared with traditional experiments and density functional theory (DFT) calculations. In this review, we present the application of ML in perovskites and briefly review recent works in the field of ML-assisted perovskite design. Firstly, the advantages of perovskites in solar cells and the merits of applying ML to perovskites are discussed. Secondly, the workflow of ML in perovskite design and some basic ML algorithms are introduced. Thirdly, the applications of ML in predicting various properties of perovskite materials and devices are reviewed. Finally, we propose some prospects for the future development of this field. The rapid development of ML technology will largely promote the progress of materials science, and ML will become an increasingly popular method for predicting the target properties of materials and devices.
Keywords: Machine learning; Perovskite; Materials design; Bandgap engineering; Stability; Crystal structure
15. Machine learning methods for predicting CO_(2) solubility in hydrocarbons
Authors: Yi Yang, Binshan Ju, Guangzhong Lü, Yingsong Huang. Petroleum Science (SCIE, EI, CAS, CSCD), 2024, Issue 5, pp. 3340-3349 (10 pages)
The application of carbon dioxide (CO_(2)) in enhanced oil recovery (EOR) has increased significantly, in which CO_(2) solubility in oil is a key parameter in predicting CO_(2) flooding performance. Hydrocarbons are the major constituents of oil, thus the focus of this work lies in investigating the solubility of CO_(2) in hydrocarbons. However, current experimental measurements are time-consuming, and equations of state can be computationally complex. To address these challenges, we developed an artificial intelligence-based model to predict the solubility of CO_(2) in hydrocarbons under varying conditions of temperature, pressure, molecular weight, and density. Using experimental data from previous studies, we trained and predicted the solubility using four machine learning models: support vector regression (SVR), extreme gradient boosting (XGBoost), random forest (RF), and multilayer perceptron (MLP). Among the four models, the XGBoost model has the best predictive performance, with an R^(2) of 0.9838. Additionally, sensitivity analysis and evaluation of the relative impacts of each input parameter indicate that the prediction of CO_(2) solubility in hydrocarbons is most sensitive to pressure. Furthermore, our trained model was compared with existing models, demonstrating the higher accuracy and applicability of our model. The developed machine learning-based model provides a more efficient and accurate approach for predicting CO_(2) solubility in hydrocarbons, which may contribute to the advancement of CO_(2)-related applications in the petroleum industry.
Keywords: CO_(2) solubility; Machine learning; Support vector regression; Extreme gradient boosting; Random forest; Multi-layer perceptron
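An illustrative gradient-boosting sketch with the four inputs named in the abstract (temperature, pressure, molecular weight, density); the ranges, units, and synthetic target are assumptions for demonstration only.

```python
# Illustrative sketch: gradient-boosted regression of CO2 solubility from the four
# inputs named in the abstract; data here are synthetic placeholders.
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(8)
n = 1200
temperature = rng.uniform(300, 400, n)        # K (assumed range)
pressure = rng.uniform(1, 30, n)              # MPa (assumed range)
mol_weight = rng.uniform(80, 400, n)          # g/mol
density = rng.uniform(0.6, 0.9, n)            # g/cm^3
solubility = 0.04 * pressure - 0.002 * (temperature - 300) + rng.normal(scale=0.02, size=n)

X = np.column_stack([temperature, pressure, mol_weight, density])
X_tr, X_te, y_tr, y_te = train_test_split(X, solubility, test_size=0.2, random_state=8)
model = XGBRegressor(n_estimators=600, learning_rate=0.05, max_depth=5).fit(X_tr, y_tr)
print("R^2:", r2_score(y_te, model.predict(X_te)))
print("feature importances (T, P, MW, rho):", model.feature_importances_)
```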
16. Navigating challenges and opportunities of machine learning in hydrogen catalysis and production processes: Beyond algorithm development
Authors: Mohd Nur Ikhmal Salehmin, Sieh Kiong Tiong, Hassan Mohamed, Dallatu Abbas Umar, Kai Ling Yu, Hwai Chyuan Ong, Saifuddin Nomanbhay, Swee Su Lim. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, Issue 12, pp. 223-252 (30 pages)
With the projected global surge in hydrogen demand, driven by increasing applications and the imperative for low-emission hydrogen, the integration of machine learning (ML) across the hydrogen energy value chain is a compelling avenue. This review uniquely focuses on harnessing the synergy between ML and computational modeling (CM) or optimization tools, as well as integrating multiple ML techniques with CM, for the synthesis of diverse hydrogen evolution reaction (HER) catalysts and various hydrogen production processes (HPPs). Furthermore, this review addresses a notable gap in the literature by offering insights, analyzing challenges, and identifying research prospects and opportunities for sustainable hydrogen production. While the literature reflects a promising landscape for ML applications in hydrogen energy domains, transitioning AI-based algorithms from controlled environments to real-world applications poses significant challenges. Hence, this comprehensive review delves into the technical, practical, and ethical considerations associated with the application of ML in HER catalyst development and HPP optimization. Overall, this review provides guidance for unlocking the transformative potential of ML in enhancing prediction efficiency and sustainability in the hydrogen production sector.
Keywords: Machine learning; Computational modeling; HER catalyst synthesis; Hydrogen energy; Hydrogen production processes; Algorithm development
17. Ensemble prediction modeling of flotation recovery based on machine learning
Authors: Guichun He, Mengfei Liu, Hongyu Zhao, Kaiqi Huang. International Journal of Mining Science and Technology (SCIE, EI, CAS, CSCD), 2024, Issue 12, pp. 1727-1740 (14 pages)
With the rise of artificial intelligence (AI) in mineral processing, predicting flotation indexes has attracted significant research attention. Nevertheless, current prediction models suffer from low accuracy and high prediction errors. Therefore, this paper utilizes a two-step procedure. First, the outliers are processed using the box chart method and a filtering algorithm. Then, the decision tree (DT), support vector regression (SVR), random forest (RF), and the bagging, boosting, and stacking integration algorithms are employed to construct a flotation recovery prediction model. Extensive experiments compared the prediction accuracy of six modeling methods on flotation recovery and delved into the impact of diverse base model combinations on the stacking model's prediction accuracy. In addition, field data have verified the model's effectiveness. This study demonstrates that the stacking ensemble approach, which uses ten variables to predict flotation recovery, yields a more favorable prediction effect than the bagging ensemble approach and single models, achieving MAE, RMSE, R^(2), and MRE scores of 0.929, 1.370, 0.843, and 1.229%, respectively. The hit rates, within error ranges of ±2% and ±4%, are 82.4% and 94.6%. Consequently, the prediction effect is relatively precise and offers significant value in the context of actual production.
Keywords: Machine learning; Stacking; Bagging; Flotation recovery rate; Filtering algorithm
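The stacking arrangement described above can be sketched with scikit-learn's StackingRegressor using the named base learners (DT, SVR, RF); the meta-learner, process variables, and data below are placeholders, and the metrics mirror those reported (MAE, RMSE, R^(2), MRE, hit rate within ±2%).

```python
# Minimal stacking sketch with the base learners named in the abstract (DT, SVR, RF).
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(9)
X = rng.normal(size=(1500, 10))                                 # ten process variables (placeholder)
recovery = 85 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=1.5, size=1500)

X_tr, X_te, y_tr, y_te = train_test_split(X, recovery, test_size=0.2, random_state=9)
stack = StackingRegressor(
    estimators=[("dt", DecisionTreeRegressor(max_depth=6)),
                ("svr", SVR(C=10.0)),
                ("rf", RandomForestRegressor(n_estimators=200, random_state=9))],
    final_estimator=Ridge())                                     # assumed meta-learner
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)

mae = mean_absolute_error(y_te, pred)
rmse = np.sqrt(mean_squared_error(y_te, pred))
r2 = r2_score(y_te, pred)
mre = np.mean(np.abs(pred - y_te) / np.abs(y_te)) * 100
hit_2 = np.mean(np.abs(pred - y_te) <= 2) * 100                  # hit rate within ±2 recovery points
print(f"MAE={mae:.3f}, RMSE={rmse:.3f}, R^2={r2:.3f}, MRE={mre:.3f}%, hit(±2%)={hit_2:.1f}%")
```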
18. Machine learning for carbonate formation drilling: Mud loss prediction using seismic attributes and mud loss records
Authors: Hui-Wen Pang, Han-Qing Wang, Yi-Tian Xiao, Yan Jin, Yun-Hu Lu, Yong-Dong Fan, Zhen Nie. Petroleum Science (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 1241-1256 (16 pages)
Due to the complexity and variability of carbonate formation leakage zones, lost circulation prediction and control is one of the major challenges of carbonate drilling. It raises well-control risks and production expenses. This research uses the H oilfield as an example, employs seismic features to analyze mud loss prediction, and produces a complete set of pre-drilling mud loss prediction solutions. Firstly, 16 seismic attributes are calculated based on the post-stack seismic data, and the mud loss rate per unit footage is specified. The sample set is constructed by extracting each attribute from the seismic traces surrounding 15 typical wells, with a ratio of 8:2 between the training set and the test set. With the calibration results for the mud loss rate per unit footage, the nonlinear mapping relationship between seismic attributes and mud loss rate per unit footage is established using a mixture density network (MDN) model. Then, the influence of the number of sub-Gaussians and the uncertainty coefficient on the model's predictions is evaluated. Finally, the model is used in conjunction with downhole drilling conditions to assess the risk of mud loss in various layers and along the wellbore trajectory. The study demonstrates that the mean relative errors of the model for the training data and test data are 6.9% and 7.5%, respectively, and that R^(2) is 90% and 88% for the training data and test data, respectively. The accuracy and efficacy of mud loss prediction can be greatly enhanced by combining the 16 seismic attributes with the mud loss rate per unit footage and applying machine learning methods. The MDN-based mud loss prediction model can not only predict the mud loss rate but also objectively evaluate the prediction based on the quality of the data and the model.
Keywords: Lost circulation; Risk prediction; Machine learning; Seismic attributes; Mud loss records
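A compact mixture density network sketch in PyTorch, assuming 16 seismic attributes as input and the mud loss rate per unit footage as the target; this illustrates the model family, not the authors' implementation.

```python
# Compact mixture density network sketch (PyTorch); inputs, sizes, and data are placeholders.
import torch
import torch.nn as nn

class MDN(nn.Module):
    def __init__(self, n_in=16, n_hidden=64, n_gauss=3):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Tanh())
        self.pi = nn.Linear(n_hidden, n_gauss)         # mixture weights
        self.mu = nn.Linear(n_hidden, n_gauss)         # component means
        self.log_sigma = nn.Linear(n_hidden, n_gauss)  # component spreads

    def forward(self, x):
        h = self.body(x)
        return torch.softmax(self.pi(h), dim=-1), self.mu(h), torch.exp(self.log_sigma(h))

def mdn_nll(pi, mu, sigma, y):
    """Negative log-likelihood of y under the predicted Gaussian mixture."""
    log_prob = torch.distributions.Normal(mu, sigma).log_prob(y)   # (batch, n_gauss)
    return -torch.logsumexp(torch.log(pi + 1e-8) + log_prob, dim=-1).mean()

# One illustrative training step on random placeholder data.
x = torch.randn(32, 16)          # 16 seismic attributes per sample (assumed)
y = torch.randn(32, 1)           # mud loss rate per unit footage (assumed, standardized)
model = MDN()
pi, mu, sigma = model(x)
loss = mdn_nll(pi, mu, sigma, y)
loss.backward()
print("NLL:", float(loss))
```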
19. A hybrid machine learning optimization algorithm for multivariable pore pressure prediction
Authors: Song Deng, Hao-Yu Pan, Hai-Ge Wang, Shou-Kun Xu, Xiao-Peng Yan, Chao-Wei Li, Ming-Guo Peng, Hao-Ping Peng, Lin Shi, Meng Cui, Fei Zhao. Petroleum Science (SCIE, EI, CAS, CSCD), 2024, Issue 1, pp. 535-550 (16 pages)
Pore pressure is essential data in drilling design, and its accurate prediction is necessary to ensure drilling safety and improve drilling efficiency. Traditional methods for predicting pore pressure are limited when dealing with particular structures and lithologies. In this paper, machine learning algorithms and the effective stress theorem are used to establish a transformation model between rock physical parameters and pore pressure. This study collects data from three wells: Well 1 had 881 data sets for model training, and Wells 2 and 3 had 538 and 464 data sets, respectively, for model testing. Support vector machine (SVM), random forest (RF), extreme gradient boosting (XGB), and multilayer perceptron (MLP) are selected as the machine learning algorithms for pore pressure modeling. In addition, this paper uses the grey wolf optimization (GWO) algorithm, particle swarm optimization (PSO) algorithm, sparrow search algorithm (SSA), and bat algorithm (BA) to establish hybrid machine learning optimization algorithms, and proposes an improved grey wolf optimization (IGWO) algorithm. The IGWO-MLP model obtained the minimum root mean square error (RMSE) under 5-fold cross-validation on the training data. For the pore pressure data in Wells 2 and 3, the coefficients of determination (R^(2)) of SVM, RF, XGB, and MLP are 0.9930 and 0.9446, 0.9943 and 0.9472, 0.9945 and 0.9488, and 0.9949 and 0.9574, respectively. MLP achieves optimal performance on both training and test data, and the MLP model shows a high degree of generalization. This indicates that IGWO-MLP is an excellent predictor of pore pressure and can be used for pore pressure prediction.
Keywords: Pore pressure; Grey wolf optimization; Multilayer perceptron; Effective stress; Machine learning
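A sketch of the tuning objective only: the 5-fold cross-validated RMSE of an MLP that a metaheuristic such as the paper's IGWO would minimise. Random search stands in for the optimiser here purely for illustration, and the features and targets are synthetic placeholders.

```python
# Sketch of the hyperparameter-tuning objective; a simple random search stands in
# for the metaheuristic optimiser described in the abstract.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(10)
X = rng.normal(size=(881, 6))                                  # placeholder well-log features
pore_pressure = 30 + 5 * X[:, 0] + rng.normal(scale=1.0, size=881)   # synthetic target (MPa)

def objective(hidden_size, alpha):
    """5-fold cross-validated RMSE of an MLP with the given hyperparameters."""
    mlp = MLPRegressor(hidden_layer_sizes=(int(hidden_size),), alpha=alpha,
                       max_iter=2000, random_state=0)
    return -cross_val_score(mlp, X, pore_pressure, cv=5,
                            scoring="neg_root_mean_squared_error").mean()

candidates = [(int(rng.integers(16, 128)), 10 ** rng.uniform(-5, -2)) for _ in range(5)]
scores = [objective(h, a) for h, a in candidates]
best = int(np.argmin(scores))
print("best candidate (hidden size, alpha):", candidates[best], "CV RMSE:", scores[best])
```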
20. Unlocking the potential of unlabeled data: Self-supervised machine learning for battery aging diagnosis with real-world field data
Authors: Qiao Wang, Min Ye, Sehriban Celik, Zhongwei Deng, Bin Li, Dirk Uwe Sauer, Weihan Li. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, Issue 12, pp. 681-691 (11 pages)
Accurate aging diagnosis is crucial for the health and safety management of lithium-ion batteries in electric vehicles. Despite significant advancements achieved by data-driven methods, diagnosis accuracy remains constrained by the high costs of check-up tests and the scarcity of labeled data. This paper presents a framework utilizing self-supervised machine learning to harness the potential of unlabeled data for diagnosing battery aging in electric vehicles during field operations. We validate our method using battery degradation datasets collected over more than two years from twenty real-world electric vehicles. Our analysis comprehensively addresses cell inconsistencies, physical interpretations, and charging uncertainties in real-world applications. This is achieved through self-supervised feature extraction using random short charging sequences in the main peak of incremental capacity curves. By leveraging inexpensive unlabeled data in a self-supervised approach, our method demonstrates improvements in average root mean square errors of 74.54% and 60.50% in the best and worst cases, respectively, compared to the supervised benchmark. This work underscores the potential of employing low-cost unlabeled data with self-supervised machine learning for effective battery health and safety management in real-world scenarios.
Keywords: Lithium-ion battery; Aging diagnosis; Self-supervised machine learning; Unlabeled data
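The self-supervised input construction described above, random short sequences around the main peak of an incremental capacity (dQ/dV) curve, can be sketched as follows; the charging curve, window length, and noise level are assumptions.

```python
# Sketch of the input construction: compute an incremental capacity (dQ/dV) curve
# from a charging record and cut a short random sequence around its main peak.
import numpy as np

rng = np.random.default_rng(11)
voltage = np.linspace(3.0, 4.2, 600)                               # V, synthetic charging curve
capacity = 2.5 / (1 + np.exp(-12 * (voltage - 3.7))) + rng.normal(scale=0.002, size=600)  # Ah

dq_dv = np.gradient(capacity, voltage)                             # incremental capacity
main_peak = int(np.argmax(dq_dv))

window = 40                                                        # assumed sequence length
start = rng.integers(max(0, main_peak - window), main_peak + 1)    # random short sequence in the peak
segment = dq_dv[start:start + window]
print("peak voltage:", round(voltage[main_peak], 3), "V; segment shape:", segment.shape)
```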