The high porosity and tunable chemical functionality of metal-organic frameworks (MOFs) make them a promising platform for catalyst design. High-throughput screening of catalytic performance is feasible because a large database of MOF structures is available. In this study, we report a machine learning model for high-throughput screening of MOF catalysts for the CO_(2) cycloaddition reaction. The descriptors for model training were judiciously chosen according to the reaction mechanism, yielding an accuracy of up to 97% with the 75% quantile of the training set as the classification criterion. The feature contributions were further evaluated with SHAP and PDP analyses to provide physical insight. Using the model, 12,415 hypothetical MOF structures and 100 reported MOFs were evaluated at 100 ℃ and 1 bar within one day, and 239 potentially efficient catalysts were identified. Among them, MOF-76(Y) achieved the best experimental performance among the reported MOFs, in good agreement with the prediction.
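The quantile-based labelling scheme described above, using the 75% quantile of the training set's performance as the class boundary, can be sketched as follows. The MOF names and scores below are invented for illustration and are not from the study.

```python
from statistics import quantiles

# Hypothetical screening scores (e.g., predicted cycloaddition performance)
# for a handful of MOFs; names and values are illustrative only.
scores = {"MOF-A": 0.42, "MOF-B": 0.88, "MOF-C": 0.65, "MOF-D": 0.91,
          "MOF-E": 0.30, "MOF-F": 0.77, "MOF-G": 0.55, "MOF-H": 0.95}

# The 75% quantile (third quartile) of training-set performance serves as
# the classification criterion: MOFs at or above it are labelled 1.
threshold = quantiles(scores.values(), n=4)[2]

labels = {name: int(score >= threshold) for name, score in scores.items()}
```

A classifier is then trained to predict these binary labels from mechanism-informed descriptors rather than from the raw performance values.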
BACKGROUND: Rapid and accurate identification of high-risk patients in emergency departments (EDs) is crucial for optimizing resource allocation and improving patient outcomes. This study aimed to develop an early prediction model for identifying high-risk patients in EDs using initial vital sign measurements. METHODS: This retrospective cohort study analyzed initial vital signs from the Chinese Emergency Triage, Assessment, and Treatment (CETAT) database, collected between January 1^(st), 2020, and June 25^(th), 2023. The primary outcome was the identification of high-risk patients needing immediate treatment. Various machine learning methods, including a deep-learning-based multilayer perceptron (MLP) classifier, were evaluated. Model performance was assessed using the area under the receiver operating characteristic curve (AUC-ROC). AUC-ROC values were reported for three scenarios: a default case, a scenario requiring sensitivity greater than 0.8 (Scenario I), and a scenario requiring specificity greater than 0.8 (Scenario II). SHAP values were calculated to determine the importance of each predictor within the MLP model. RESULTS: A total of 38,797 patients were analyzed, of whom 18.2% were identified as high-risk. Comparative analysis of the predictive models showed AUC-ROC values ranging from 0.717 to 0.738, with the MLP model outperforming logistic regression (LR), Gaussian naive Bayes (GNB), and the National Early Warning Score (NEWS). SHAP analysis identified coma state, peripheral capillary oxygen saturation (SpO_(2)), and systolic blood pressure as the top three predictive factors in the MLP model, with coma state contributing the most. CONCLUSION: Compared with the other methods, the MLP model using initial vital signs demonstrated optimal prediction accuracy, highlighting its potential to enhance clinical decision-making in ED triage.
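The evaluation protocol above, AUC-ROC plus operating points constrained by a sensitivity floor (Scenario I), can be illustrated with a minimal rank-based sketch. The labels and scores below are toy values, not drawn from the CETAT cohort.

```python
import math

def roc_auc(labels, scores):
    # Rank-based AUC: the probability that a random positive case receives
    # a higher score than a random negative one (ties count half).
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def scenario_threshold(labels, scores, min_sensitivity=0.8):
    # Scenario I: the highest decision threshold that still classifies at
    # least min_sensitivity of the true positives as positive.
    pos = sorted((s for y, s in zip(labels, scores) if y == 1), reverse=True)
    k = math.ceil(min_sensitivity * len(pos))
    return pos[k - 1]
```

Scenario II is symmetric: sweep the threshold from the negative side until specificity exceeds the required floor.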
Organic solar cells (OSCs) hold great potential as a photovoltaic technology for practical applications. However, the traditional experimental trial-and-error method for designing and engineering OSCs can be complex, expensive, and time-consuming. Machine learning (ML) techniques enable the proficient extraction of information from datasets, allowing the development of realistic models capable of predicting the efficacy of materials with commendable accuracy. The PM6 donor has great potential for high-performance OSCs; however, for the rational design of a ternary blend, it is crucial to accurately forecast the power conversion efficiency (PCE) of ternary OSCs (TOSCs) based on a PM6 donor. Accordingly, we collected the device parameters of PM6-based TOSCs and evaluated the feature importance of their molecular descriptors to develop predictive models. In this study, we used five different ML algorithms for analysis and prediction. For the analysis, the classification and regression tree provided different rules, heuristics, and patterns from the heterogeneous dataset. The random forest algorithm outperformed the other ML algorithms in predicting the output performance of PM6-based TOSCs. Finally, we validated the ML outcomes by fabricating PM6-based TOSCs. Our study presents a rapid strategy for assessing high PCE while elucidating the substantial influence of diverse descriptors.
Accurately estimating protein–ligand binding free energy is crucial for drug design and biophysics, yet it remains a challenging task. In this study, we applied the screening molecular mechanics/Poisson–Boltzmann surface area (MM/PBSA) method in combination with various machine learning techniques to compute the binding free energies of protein–ligand interactions. Our results demonstrate that machine learning outperforms direct screening MM/PBSA calculations in predicting protein–ligand binding free energies. Notably, the random forest (RF) method exhibited the best predictive performance, with a Pearson correlation coefficient (r_(p)) of 0.702 and a mean absolute error (MAE) of 1.379 kcal/mol. Furthermore, we analyzed feature importance rankings in the gradient boosting (GB), adaptive boosting (AdaBoost), and RF methods, and found that feature selection significantly impacted predictive performance. In particular, molecular weight (MW) and van der Waals (VDW) energies played a decisive role in the prediction. Overall, this study highlights the potential of combining machine learning methods with screening MM/PBSA to accurately predict binding free energies in biosystems.
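The two figures of merit quoted above, the Pearson correlation coefficient r_(p) and the mean absolute error, are straightforward to compute; a stdlib-only sketch (the test data are toy values, not the MM/PBSA results):

```python
from statistics import fmean

def pearson_r(x, y):
    # Pearson correlation: covariance normalised by both standard deviations.
    mx, my = fmean(x), fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def mean_absolute_error(y_true, y_pred):
    # MAE in the units of the target (kcal/mol in the study).
    return fmean(abs(t - p) for t, p in zip(y_true, y_pred))
```

Reporting both is informative because r_(p) captures ranking quality while MAE captures absolute accuracy; a model can score well on one and poorly on the other.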
In-situ upgrading by heating is feasible for low-maturity shale oil, where the pore space evolves dynamically. We characterize this response for a heated substrate concurrently imaged by SEM. We systematically follow the evolution of pore quantity, size (length, width, and cross-sectional area), orientation, shape (aspect ratio, roundness, and solidity), and their anisotropy, interpreted by machine learning. Results indicate that heating generates new pores in both organic matter and inorganic minerals. However, the newly formed pores are smaller than the original pores and thus reduce the average lengths and widths of the bedding-parallel pore system. Conversely, the average pore lengths and widths increase in the bedding-perpendicular direction. Heating also increases the cross-sectional area of pores in low-maturity oil shales, where this growth fluctuates below 300 ℃ but becomes steady above 300 ℃. In addition, the orientation and shape of the newly formed heating-induced pores follow the habit of the original pores and their initial probability distributions. Only limited anisotropy is detected in pore direction and shape, indicating similar modes of evolution in the bedding-parallel and bedding-normal directions. We propose a straightforward but robust model to describe the evolution of the pore system in low-maturity oil shales during heating.
Machine learning (ML) is well suited to the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small- and medium-calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this, we utilise two datasets, each consisting of approximately 1000 samples, collated from public-release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data but diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, such capability is required for applications such as lethality/survivability analysis. To circumvent this, we incorporate expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing the ability to accurately fit experimental data when available and to revert to the physics-based model when not. The resulting models demonstrate high predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than achievable with any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide general guidelines for the development, application, and reporting of ML models in terminal ballistics problems.
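One of the blending mechanisms mentioned above, a modified loss function that penalises departure from a physics-based estimate, can be sketched as below. The quadratic penalty and the weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
def physics_informed_loss(pred, target, physics, lam=0.1):
    # Mean squared data misfit plus a weighted penalty pulling the model
    # toward the physics-based prediction; the penalty dominates wherever
    # experimental data are sparse, so the model "reverts to physics" there.
    data_term = sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)
    prior_term = sum((p, q)[0] * 0 + (p - q) ** 2 for p, q in zip(pred, physics)) / len(pred)
    return data_term + lam * prior_term
```

Tuning `lam` trades fidelity to measurements against agreement with the physics model in unexplored regions of the input space.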
Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value, nondestructive testing methods are essential. This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis. This study explores the application of machine learning algorithms to predicting the mineralogical and mechanical properties of the DHOFAR 1084, JAH 838, and NWA 11444 lunar meteorites based solely on their atomic percentage compositions. Leveraging a prior-data fitted network model, we achieved near-perfect classification scores for meteorites, mineral groups, and individual minerals. The regressor models, notably the KNeighbors model, provided outstanding estimates of the mechanical properties previously measured by nanoindentation tests, such as hardness, reduced Young's modulus, and elastic recovery. Further consideration of the nature and physical properties of the minerals forming these meteorites, including porosity, crystal orientation, and shock degree, is essential for refining the predictions. Our findings underscore the potential of machine learning to enhance mineral identification and mechanical property estimation in lunar exploration, paving the way for new advancements and rapid assessments in extraterrestrial mineral mining, processing, and research.
Metal-ion batteries (MIBs), including alkali metal-ion (Li^(+), Na^(+), and K^(+)), multivalent metal-ion (Zn^(2+), Mg^(2+), and Al^(3+)), metal-air, and metal-sulfur batteries, play an indispensable role in electrochemical energy storage. However, the performance of MIBs is significantly influenced by numerous variables, posing multi-dimensional and long-term challenges for battery research and performance enhancement. Machine learning (ML), with its capability to solve intricate tasks and perform robust data processing, is now catalyzing a revolutionary transformation in the development of MIB materials and devices. In this review, we summarize the ML algorithms that have expedited research on MIBs over the past five years. We present an extensive overview of existing algorithms, elucidating their details, advantages, and limitations in various applications, encompassing electrode screening, material property prediction, electrolyte formulation design, electrode material characterization, manufacturing parameter optimization, and real-time battery status monitoring. Finally, we propose potential solutions and future directions for the application of ML in advancing MIB development.
The advent of pandemics such as COVID-19 significantly impacts human behaviour and lives every day. It is therefore essential to make medical services connected to the internet available in every remote location during such situations. Moreover, security issues in the Internet of Medical Things (IoMT) devices used to deliver these services make the situation even more critical, because cyberattacks on medical devices can cause treatment delays or clinical failures. Services in the healthcare ecosystem therefore need rapid, uninterrupted, and secure facilities. The solution provided in this research addresses security concerns and service availability for patients in critical health in remote areas. This research aims to develop an intelligent software-defined network (SDN)-enabled secure framework for the IoT healthcare ecosystem. We propose a hybrid of machine learning and deep learning techniques (DNN + SVM) to identify network intrusions in sensor-based healthcare data. In addition, the system can efficiently monitor connected devices and suspicious behaviours. Finally, we evaluate the performance of the proposed framework using various performance metrics based on healthcare application scenarios. The experimental results show that the proposed approach effectively detects and mitigates attacks in SDN-enabled IoT networks and performs better than other state-of-the-art approaches.
Membrane technologies are becoming increasingly versatile and helpful for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norms for new energy and environmental materials. This review provides an overview of and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, along with the current bottlenecks and potential solutions. Through an applications-based perspective on AI-aided membrane design and discovery, we further show how ML strategies are applied across the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, ranging from gas and liquid separation membranes to fuel cell membranes. Furthermore, best practices for integrating ML methods with specific application targets in membrane design and discovery are presented, and an ideal paradigm is proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are highlighted at the end.
Early non-invasive diagnosis of coronary heart disease (CHD) is critical, yet accurate CHD diagnosis via breath detection remains challenging. In this work, heterostructured complexes of black phosphorus (BP) and two-dimensional carbide and nitride (MXene) with high gas sensitivity and photoresponsiveness were formulated using a self-assembly strategy. A light-activated virtual sensor array (LAVSA) based on BP/Ti_(3)C_(2)Tx was prepared under photomodulation and further assembled into an instant gas sensing platform (IGSP). In addition, a machine learning (ML) algorithm was introduced to help the IGSP detect and recognize the signals of breath samples to diagnose CHD. Owing to the synergistic effect of BP and Ti_(3)C_(2)Tx as well as photoexcitation, the synthesized heterostructured complexes exhibited higher performance than pristine Ti_(3)C_(2)Tx, with a response value 26% higher. With the help of a pattern recognition algorithm, LAVSA successfully detected and identified 15 odor molecules affiliated with alcohols, ketones, aldehydes, esters, and acids. Meanwhile, with the assistance of ML, the IGSP achieved 69.2% accuracy in classifying the breath odor of 45 volunteers comprising healthy individuals and CHD patients. In conclusion, an immediate, low-cost, and accurate prototype was designed and fabricated for the noninvasive diagnosis of CHD, providing a generalized solution for diagnosing other diseases and for more complex application scenarios.
The safe and reliable operation of lithium-ion batteries necessitates accurate prediction of the remaining useful life (RUL). However, this task is challenging due to diverse ageing mechanisms, various operating conditions, and limited measured signals. Although data-driven methods are perceived as a promising solution, they ignore intrinsic battery physics, leading to compromised accuracy, low efficiency, and low interpretability. In response, this study integrates domain knowledge into deep learning to enhance RUL prediction performance, demonstrating accurate RUL prediction using only a single charging curve. First, a generalisable physics-based model is developed to extract ageing-correlated parameters from battery charging data that can describe and explain battery degradation. These parameters inform a deep neural network (DNN) to predict RUL with high accuracy and efficiency. The trained model is validated on 3 types of batteries working under 7 conditions, considering both fully charged and partially charged cases. Using data from only one cycle, the proposed method achieves a root mean squared error (RMSE) of 11.42 cycles and a mean absolute relative error (MARE) of 3.19% on average, which are over 45% and 44% lower, respectively, than two state-of-the-art data-driven methods. Besides its accuracy, the proposed method also outperforms existing methods in efficiency, input burden, and robustness. The inherent relationship between the model parameters and the battery degradation mechanism is further revealed, substantiating the intrinsic superiority of the proposed method.
The recent wave of the artificial intelligence (AI) revolution has aroused unprecedented interest in the intelligentization of human society. As an essential component bridging the physical world and digital signals, flexible sensors are evolving from single sensing elements into smarter systems capable of highly efficient acquisition, analysis, and even perception of vast, multifaceted data. While challenging to achieve manually, the development of intelligent flexible sensing has been remarkably facilitated by rapid advances in brain-inspired AI innovations at both the algorithm (machine learning) and framework (artificial synapse) levels. This review presents the recent progress of emerging AI-driven intelligent flexible sensing systems. The basic concepts of machine learning and artificial synapses are introduced. The new enabling features induced by the fusion of AI and flexible sensing are comprehensively reviewed, which significantly advance applications such as flexible sensory systems, soft/humanoid robotics, and human activity monitoring. As two of the most profound innovations of the twenty-first century, the deep incorporation of flexible sensing and AI technology holds tremendous potential for creating a smarter world for human beings.
Metal-halide hybrid perovskite materials are excellent candidates for solar cells and photoelectric devices. In recent years, machine learning (ML) techniques have developed rapidly in many fields and provided ideas for material discovery and design. ML can be applied to discover new materials quickly and effectively, with significant savings in resources and time compared with traditional experiments and density functional theory (DFT) calculations. In this review, we present the application of ML to perovskites and briefly review recent work in the field of ML-assisted perovskite design. Firstly, the advantages of perovskites in solar cells and the merits of applying ML to perovskites are discussed. Secondly, the workflow of ML in perovskite design and some basic ML algorithms are introduced. Thirdly, applications of ML in predicting various properties of perovskite materials and devices are reviewed. Finally, we propose some prospects for the future development of this field. The rapid development of ML technology will greatly advance materials science, and ML will become an increasingly popular method for predicting the target properties of materials and devices.
The application of carbon dioxide (CO_(2)) in enhanced oil recovery (EOR) has increased significantly, and CO_(2) solubility in oil is a key parameter in predicting CO_(2) flooding performance. Hydrocarbons are the major constituents of oil, so this work focuses on investigating the solubility of CO_(2) in hydrocarbons. However, current experimental measurements are time-consuming, and equations of state can be computationally complex. To address these challenges, we developed an artificial intelligence-based model to predict the solubility of CO_(2) in hydrocarbons under varying conditions of temperature, pressure, molecular weight, and density. Using experimental data from previous studies, we trained and predicted the solubility using four machine learning models: support vector regression (SVR), extreme gradient boosting (XGBoost), random forest (RF), and multilayer perceptron (MLP). Among the four models, the XGBoost model has the best predictive performance, with an R^(2) of 0.9838. Additionally, sensitivity analysis and evaluation of the relative impacts of each input parameter indicate that the prediction of CO_(2) solubility in hydrocarbons is most sensitive to pressure. Furthermore, our trained model was compared with existing models, demonstrating higher accuracy and applicability. The developed machine learning-based model provides a more efficient and accurate approach for predicting CO_(2) solubility in hydrocarbons, which may contribute to the advancement of CO_(2)-related applications in the petroleum industry.
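A sensitivity analysis of the kind described above can be approximated model-agnostically by permutation importance: shuffle one input column and measure how much the error grows. The toy model and data below are hypothetical, not the study's XGBoost model.

```python
import random
from statistics import fmean

def permutation_importance(model, X, y, col, seed=0):
    # Increase in MAE after shuffling column `col`; larger values mean the
    # model leans more heavily on that input (cf. pressure for CO2 solubility).
    rng = random.Random(seed)
    base = fmean(abs(model(row) - t) for row, t in zip(X, y))
    shuffled = [row[:] for row in X]
    vals = [row[col] for row in shuffled]
    rng.shuffle(vals)
    for row, v in zip(shuffled, vals):
        row[col] = v
    return fmean(abs(model(row) - t) for row, t in zip(shuffled, y)) - base
```

Repeating the shuffle over several seeds and averaging gives a more stable ranking of the four inputs (temperature, pressure, molecular weight, density).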
With the projected global surge in hydrogen demand, driven by increasing applications and the imperative for low-emission hydrogen, the integration of machine learning (ML) across the hydrogen energy value chain is a compelling avenue. This review uniquely focuses on harnessing the synergy between ML and computational modeling (CM) or optimization tools, as well as integrating multiple ML techniques with CM, for the synthesis of diverse hydrogen evolution reaction (HER) catalysts and various hydrogen production processes (HPPs). Furthermore, this review addresses a notable gap in the literature by offering insights, analyzing challenges, and identifying research prospects and opportunities for sustainable hydrogen production. While the literature reflects a promising landscape for ML applications in hydrogen energy domains, transitioning AI-based algorithms from controlled environments to real-world applications poses significant challenges. Hence, this comprehensive review delves into the technical, practical, and ethical considerations associated with the application of ML to HER catalyst development and HPP optimization. Overall, this review provides guidance for unlocking the transformative potential of ML in enhancing prediction efficiency and sustainability in the hydrogen production sector.
With the rise of artificial intelligence (AI) in mineral processing, predicting flotation indexes has attracted significant research attention. Nevertheless, current prediction models suffer from low accuracy and high prediction errors. Therefore, this paper utilizes a two-step procedure. First, outliers are processed using the box chart method and a filtering algorithm. Then, decision tree (DT), support vector regression (SVR), and random forest (RF) models, together with bagging, boosting, and stacking integration algorithms, are employed to construct a flotation recovery prediction model. Extensive experiments compared the prediction accuracy of the six modeling methods on flotation recovery and examined the impact of diverse base-model combinations on the stacking model's prediction accuracy. In addition, field data verified the model's effectiveness. This study demonstrates that the stacking ensemble approach, which uses ten variables to predict flotation recovery, yields a more favorable prediction effect than the bagging ensemble approach and the single models, achieving MAE, RMSE, R^(2), and MRE scores of 0.929, 1.370, 0.843, and 1.229%, respectively. The hit rates within error ranges of ±2% and ±4% are 82.4% and 94.6%, respectively. Consequently, the prediction is relatively precise and offers significant value in actual production.
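The hit rate quoted above, the share of predictions landing within ±2% or ±4% of the measured recovery, is a simple accept-band metric; a sketch with invented recovery values:

```python
def hit_rate(y_true, y_pred, tol):
    # Fraction of predictions whose absolute error falls within ±tol
    # (tol in percentage points of flotation recovery).
    hits = sum(1 for t, p in zip(y_true, y_pred) if abs(p - t) <= tol)
    return hits / len(y_true)
```

Unlike MAE or RMSE, this metric directly answers the plant operator's question of how often the prediction is close enough to act on.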
Due to the complexity and variability of carbonate formation leakage zones, lost circulation prediction and control is one of the major challenges of carbonate drilling; it raises well-control risks and production expenses. This research takes the H oilfield as an example, employs seismic features to analyze mud loss prediction, and produces a complete set of pre-drilling mud loss prediction solutions. Firstly, 16 seismic attributes are calculated based on the post-stack seismic data, and the mud loss rate per unit footage is specified. The sample set is constructed by extracting each attribute from the seismic traces surrounding 15 typical wells, with an 8:2 ratio between the training set and the test set. With the calibration results for the mud loss rate per unit footage, the nonlinear mapping relationship between the seismic attributes and the mud loss rate per unit footage is established using a mixture density network (MDN) model. Then, the influence of the number of sub-Gaussians and the uncertainty coefficient on the model's prediction is evaluated. Finally, the model is used in conjunction with downhole drilling conditions to assess the risk of mud loss in various layers and along the wellbore trajectory. The study demonstrates that the mean relative errors of the model for the training and test data are 6.9% and 7.5%, respectively, and that R^(2) is 90% and 88%, respectively. The accuracy and efficacy of mud loss prediction can be greatly enhanced by combining the 16 seismic attributes with the mud loss rate per unit footage and applying machine learning methods. The MDN-based mud loss prediction model can not only predict the mud loss rate but also objectively evaluate the prediction based on the quality of the data and the model.
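The mixture density network referenced above outputs the parameters of a Gaussian mixture rather than a single value, and is trained by minimising the negative log-likelihood of the observed loss rate; a stdlib sketch of that objective (the mixture parameters below are toy values, not the H-oilfield model's):

```python
import math

def mdn_nll(y, weights, means, sigmas):
    # Negative log-likelihood of y under a Gaussian mixture — the quantity an
    # MDN minimises. The spread of the fitted mixture is what allows the model
    # to report uncertainty alongside the point prediction.
    density = sum(w * math.exp(-0.5 * ((y - m) / s) ** 2) /
                  (s * math.sqrt(2 * math.pi))
                  for w, m, s in zip(weights, means, sigmas))
    return -math.log(density)
```

In the study's terms, the number of sub-Gaussians is the length of these parameter lists, and evaluating its influence amounts to re-fitting with different mixture sizes.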
Pore pressure is essential data in drilling design, and its accurate prediction is necessary to ensure drilling safety and improve drilling efficiency. Traditional methods for predicting pore pressure are limited in formations with particular structures and lithologies. In this paper, machine learning algorithms and the effective stress theorem are used to establish a transformation model between rock physical parameters and pore pressure. This study collects data from three wells: Well 1 provided 881 data sets for model training, and Wells 2 and 3 provided 538 and 464 data sets, respectively, for model testing. Support vector machine (SVM), random forest (RF), extreme gradient boosting (XGB), and multilayer perceptron (MLP) are selected as the machine learning algorithms for pore pressure modeling. In addition, this paper uses the grey wolf optimization (GWO) algorithm, particle swarm optimization (PSO) algorithm, sparrow search algorithm (SSA), and bat algorithm (BA) to establish hybrid machine learning optimization algorithms, and proposes an improved grey wolf optimization (IGWO) algorithm. The IGWO-MLP model obtained the minimum root mean square error (RMSE) using 5-fold cross-validation on the training data. For the pore pressure data in Wells 2 and 3, the coefficients of determination (R^(2)) of SVM, RF, XGB, and MLP are 0.9930 and 0.9446, 0.9943 and 0.9472, 0.9945 and 0.9488, and 0.9949 and 0.9574, respectively. MLP achieves optimal performance on both the training and test data and shows a high degree of generalization, indicating that IGWO-MLP is an excellent predictor that can be used to predict pore pressure.
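The grey wolf optimizer at the heart of such GWO-MLP hybrids can be illustrated in a few lines: the pack repeatedly moves toward its three best members (alpha, beta, delta) while a coefficient decays from 2 to 0, shifting the search from exploration to exploitation. This is the canonical GWO, not the paper's improved (IGWO) variant, and the test function is a toy sphere rather than MLP hyperparameters.

```python
import random

def gwo_minimize(f, bounds, wolves=12, iters=80, seed=0):
    # Canonical grey wolf optimization of f over box-constrained inputs.
    rng = random.Random(seed)
    pack = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(wolves)]
    for t in range(iters):
        pack.sort(key=f)
        alpha, beta, delta = pack[0], pack[1], pack[2]
        a = 2.0 - 2.0 * t / iters  # decays 2 -> 0 over the run
        for i in range(wolves):
            new = []
            for d, (lo, hi) in enumerate(bounds):
                x = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A = a * (2.0 * r1 - 1.0)
                    C = 2.0 * r2
                    # step toward each leader, perturbed by A and C
                    x += leader[d] - A * abs(C * leader[d] - pack[i][d])
                new.append(min(hi, max(lo, x / 3.0)))
            pack[i] = new
    return min(pack, key=f)
```

In the hybrid setting, `f` would wrap the cross-validated RMSE of an MLP trained with the candidate hyperparameters, and `bounds` would delimit the hyperparameter ranges.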
Accurate aging diagnosis is crucial for the health and safety management of lithium-ion batteries in electric vehicles. Despite significant advancements achieved by data-driven methods, diagnosis accuracy remains constrained by the high costs of check-up tests and the scarcity of labeled data. This paper presents a framework utilizing self-supervised machine learning to harness the potential of unlabeled data for diagnosing battery aging in electric vehicles during field operations. We validate our method using battery degradation datasets collected over more than two years from twenty real-world electric vehicles. Our analysis comprehensively addresses cell inconsistencies, physical interpretations, and charging uncertainties in real-world applications. This is achieved through self-supervised feature extraction using random short charging sequences in the main peak of incremental capacity curves. By leveraging inexpensive unlabeled data in a self-supervised approach, our method demonstrates improvements in average root mean square error of 74.54% and 60.50% in the best and worst cases, respectively, compared with the supervised benchmark. This work underscores the potential of employing low-cost unlabeled data with self-supervised machine learning for effective battery health and safety management in real-world scenarios.
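The incremental capacity (IC) curve from which the short charging sequences are drawn is simply dQ/dV; a finite-difference sketch (the voltage/capacity points are synthetic, not the EV fleet data):

```python
def incremental_capacity(voltage, capacity):
    # dQ/dV by first-order finite differences over a charging curve; peaks of
    # this curve mark the phase-transition plateaus whose shape shifts with
    # aging, which is why features are sampled around the main peak.
    return [(q1 - q0) / (v1 - v0)
            for (v0, v1), (q0, q1) in zip(zip(voltage, voltage[1:]),
                                          zip(capacity, capacity[1:]))]
```

In practice the measured curve is smoothed (e.g., with a moving average) before differencing, since raw dQ/dV amplifies sensor noise.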
Funding: Financial support from the National Key Research and Development Program of China (2021YFB3501501), the National Natural Science Foundation of China (Nos. 22225803, 22038001, 22108007, and 22278011), the Beijing Natural Science Foundation (No. Z230023), and the Beijing Science and Technology Commission (No. Z211100004321001).
Abstract: The high porosity and tunable chemical functionality of metal-organic frameworks (MOFs) make them a promising platform for catalyst design. Because large MOF structure databases are available, high-throughput screening of catalytic performance is feasible. In this study, we report a machine learning model for high-throughput screening of MOF catalysts for the CO_(2) cycloaddition reaction. The descriptors for model training were judiciously chosen according to the reaction mechanism, which leads to an accuracy of up to 97% when the 75% quantile of the training set is used as the classification criterion. The feature contributions were further evaluated with SHAP and PDP analyses to provide physical insight. Using the model, 12,415 hypothetical MOF structures and 100 reported MOFs were evaluated at 100 °C and 1 bar within one day, and 239 potentially efficient catalysts were discovered. Among them, MOF-76(Y) achieved the top experimental performance among the reported MOFs, in good agreement with the prediction.
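The screening setup described above, labeling structures above the 75% quantile of the training set as high-performing and fitting a classifier on mechanism-derived descriptors, can be sketched as follows. The random data, descriptor count, and choice of a random forest classifier are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Illustrative stand-ins for mechanism-derived MOF descriptors and yields.
X = rng.normal(size=(400, 6))
yield_ = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=400)

# Label MOFs above the 75% quantile of the training yields as "efficient" (1).
threshold = np.quantile(yield_, 0.75)
labels = (yield_ >= threshold).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
accuracy = clf.score(X, labels)  # training accuracy; a held-out split is used in practice
```

Once trained, such a classifier can score a large hypothetical-structure database far faster than running each reaction simulation, which is what makes a one-day screen of 12,415 structures plausible.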
Funding: University of Science and Technology of China (to YLL), the National Natural Science Foundation of China (12126604) (to MPZ), the R&D project of Pazhou Lab (Huangpu) (2023K0609) (to MPZ), and the Anhui Provincial Natural Science Foundation (grant number 2208085MH235) (to KJ).
Abstract: BACKGROUND: Rapid and accurate identification of high-risk patients in emergency departments (EDs) is crucial for optimizing resource allocation and improving patient outcomes. This study aimed to develop an early prediction model for identifying high-risk patients in EDs using initial vital sign measurements. METHODS: This retrospective cohort study analyzed initial vital signs from the Chinese Emergency Triage, Assessment, and Treatment (CETAT) database, collected between January 1st, 2020, and June 25th, 2023. The primary outcome was the identification of high-risk patients needing immediate treatment. Various machine learning methods, including a deep-learning-based multilayer perceptron (MLP) classifier, were evaluated. Model performance was assessed using the area under the receiver operating characteristic curve (AUC-ROC). AUC-ROC values were reported for three scenarios: a default case, a scenario requiring sensitivity greater than 0.8 (Scenario I), and a scenario requiring specificity greater than 0.8 (Scenario II). SHAP values were calculated to determine the importance of each predictor within the MLP model. RESULTS: A total of 38,797 patients were analyzed, of whom 18.2% were identified as high-risk. Comparative analysis of the predictive models showed AUC-ROC values ranging from 0.717 to 0.738, with the MLP model outperforming logistic regression (LR), Gaussian Naive Bayes (GNB), and the National Early Warning Score (NEWS). SHAP analysis identified coma state, peripheral capillary oxygen saturation (SpO_(2)), and systolic blood pressure as the top three predictive factors in the MLP model, with coma state contributing the most. CONCLUSION: Compared with the other methods, the MLP model with initial vital signs demonstrated optimal prediction accuracy, highlighting its potential to enhance clinical decision-making in ED triage.
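The Scenario I reporting above (operating the classifier at a threshold where sensitivity exceeds 0.8) amounts to scanning the ROC operating points for the highest threshold that still meets the recall target. A minimal sketch on toy triage scores, with all data and parameters illustrative:

```python
import numpy as np

def threshold_for_sensitivity(y_true, scores, min_sens=0.8):
    """Highest score threshold whose sensitivity still meets min_sens,
    mirroring a Scenario I (sensitivity > 0.8) style of reporting."""
    order = np.argsort(scores)[::-1]              # highest-risk scores first
    y_sorted = np.asarray(y_true)[order]
    sens = np.cumsum(y_sorted) / y_sorted.sum()   # sensitivity at each cut
    k = int(np.argmax(sens >= min_sens))          # first cut meeting the target
    return float(np.asarray(scores)[order][k])

# Toy triage scores: high-risk patients (label 1) tend to score higher.
rng = np.random.default_rng(1)
y = np.r_[np.ones(50), np.zeros(200)]
s = np.r_[rng.normal(0.7, 0.15, 50), rng.normal(0.4, 0.15, 200)]
t = threshold_for_sensitivity(y, s, 0.8)
sens_at_t = float(((s >= t) & (y == 1)).sum() / y.sum())
```

Specificity at the chosen threshold then tells you how many low-risk patients are flagged as the price of that recall, which is the trade-off the two scenarios make explicit.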
Funding: National Research Foundation of Korea (NRF) grant (No. 2016R1A3B1908249) funded by the Korean government.
Abstract: Organic solar cells (OSCs) hold great potential as a photovoltaic technology for practical applications. However, the traditional experimental trial-and-error method for designing and engineering OSCs can be complex, expensive, and time-consuming. Machine learning (ML) techniques enable the proficient extraction of information from datasets, allowing the development of realistic models capable of predicting the efficacy of materials with commendable accuracy. The PM6 donor has great potential for high-performance OSCs, but the rational design of a ternary blend requires accurately forecasting the power conversion efficiency (PCE) of ternary OSCs (TOSCs) based on a PM6 donor. Accordingly, we collected the device parameters of PM6-based TOSCs and evaluated the feature importance of their molecular descriptors to develop predictive models. In this study, we used five different ML algorithms for analysis and prediction. For the analysis, the classification and regression tree provided different rules, heuristics, and patterns from the heterogeneous dataset. The random forest algorithm outperformed the other ML algorithms in predicting the output performance of PM6-based TOSCs. Finally, we validated the ML outcomes by fabricating PM6-based TOSCs. Our study presents a rapid strategy for assessing a high PCE while elucidating the substantial influence of diverse descriptors.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 12222506, 12347102, 12447164, and 12174184).
Abstract: Accurately estimating protein-ligand binding free energy is crucial for drug design and biophysics, yet remains a challenging task. In this study, we applied the screening molecular mechanics/Poisson-Boltzmann surface area (MM/PBSA) method in combination with various machine learning techniques to compute the binding free energies of protein-ligand interactions. Our results demonstrate that machine learning outperforms direct screening MM/PBSA calculations in predicting protein-ligand binding free energies. Notably, the random forest (RF) method exhibited the best predictive performance, with a Pearson correlation coefficient (r_(p)) of 0.702 and a mean absolute error (MAE) of 1.379 kcal/mol. Furthermore, we analyzed feature importance rankings in the gradient boosting (GB), adaptive boosting (AdaBoost), and RF methods, and found that feature selection significantly impacted predictive performance. In particular, molecular weight (MW) and van der Waals (VDW) energies played a decisive role in the prediction. Overall, this study highlights the potential of combining machine learning methods with screening MM/PBSA for accurately predicting binding free energies in biosystems.
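The core regression-and-metrics loop above (random forest on physics-derived descriptors, evaluated by Pearson correlation and MAE) can be sketched with synthetic data. The feature count and the linear toy target are assumptions standing in for descriptors such as MW and VDW energies, not the study's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
# Stand-in descriptors (e.g. MW, VDW energy, ...) and binding free energies.
X = rng.normal(size=(300, 5))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=300)

X_train, X_test = X[:240], X[240:]
y_train, y_test = y[:240], y[240:]

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)
pred = rf.predict(X_test)

r_p = float(np.corrcoef(y_test, pred)[0, 1])   # Pearson correlation coefficient
mae = float(np.mean(np.abs(y_test - pred)))    # mean absolute error
```

`rf.feature_importances_` then gives the impurity-based ranking that, in the study, surfaced MW and VDW energies as the decisive features.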
Funding: Financially supported by the National Key Research and Development Program of China (Grant No. 2022YFE0129800) and the National Natural Science Foundation of China (Grant No. 42202204).
Abstract: In-situ upgrading by heating is feasible for low-maturity shale oil, where the pore space evolves dynamically. We characterize this response for a heated substrate concurrently imaged by SEM. We systematically follow the evolution of pore quantity, size (length, width, and cross-sectional area), orientation, shape (aspect ratio, roundness, and solidity), and their anisotropy, interpreted by machine learning. Results indicate that heating generates new pores in both organic matter and inorganic minerals. However, the newly formed pores are smaller than the original pores and thus reduce the average lengths and widths of the bedding-parallel pore system. Conversely, the average pore lengths and widths increase in the bedding-perpendicular direction. Heating also increases the cross-sectional area of pores in low-maturity oil shales, where this growth fluctuates below 300 °C but becomes steady above 300 °C. In addition, the orientation and shape of the newly formed, heating-induced pores follow the habit of the original pores and their initial probability distributions of orientation and shape. Only limited anisotropy is detected in pore direction and shape, indicating similar modes of evolution in both the bedding-parallel and bedding-normal directions. We propose a straightforward but robust model to describe the evolution of the pore system in low-maturity oil shales during heating.
Abstract: Machine learning (ML) is well suited to the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small- and medium-calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public-release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, such capability is required for applications such as lethality/survivability analysis. To circumvent this, we incorporate expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to fit experimental data accurately when available and to revert to the physics-based model when it is not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than is achievable with any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Funding: EP-A and JMT-R acknowledge financial support from the project PID2021-128062NB-I00 funded by MCIN/AEI/10.13039/501100011033. The lunar samples studied here were acquired in the framework of grant PGC2018-097374-B-I00 (P.I. JMT-R). This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (No. 865657) for the project "Quantum Chemistry on Interstellar Grains" (QUANTUMGRAIN). AR acknowledges financial support from the FEDER/Ministerio de Ciencia e Innovación-Agencia Estatal de Investigación (No. PID2021-126427NB-I00). Partial financial support from the Spanish Government (No. PID2020-116844RB-C21) and the Generalitat de Catalunya (No. 2021-SGR-00651) is acknowledged. This work was supported by the LUMIO project funded by the Agenzia Spaziale Italiana (No. 2024-6-HH.0).
Abstract: Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value, non-destructive testing methods are essential. This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis. This study explores the application of machine learning algorithms in predicting the mineralogical and mechanical properties of the DHOFAR 1084, JAH 838, and NWA 11444 lunar meteorites based solely on their atomic percentage compositions. Leveraging a prior-data fitted network model, we achieved near-perfect classification scores for meteorites, mineral groups, and individual minerals. The regressor models, notably the KNeighbors model, provided outstanding estimates of the mechanical properties previously measured by nanoindentation tests, such as hardness, reduced Young's modulus, and elastic recovery. Further consideration of the nature and physical properties of the minerals forming these meteorites, including porosity, crystal orientation, and shock degree, is essential for refining predictions. Our findings underscore the potential of machine learning to enhance mineral identification and mechanical property estimation in lunar exploration, paving the way for new advancements and quick assessments in extraterrestrial mineral mining, processing, and research.
Funding: Supported by the National Natural Science Foundation of China (52203364, 52188101, 52020105010), the National Key R&D Program of China (2021YFB3800300, 2022YFB3803400), the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA22010602), the China Postdoctoral Science Foundation (2022M713214), and the China National Postdoctoral Program for Innovative Talents (BX2021321).
Abstract: Metal-ion batteries (MIBs), including alkali metal-ion (Li^(+), Na^(+), and K^(+)), multivalent metal-ion (Zn^(2+), Mg^(2+), and Al^(3+)), metal-air, and metal-sulfur batteries, play an indispensable role in electrochemical energy storage. However, the performance of MIBs is significantly influenced by numerous variables, resulting in multi-dimensional and long-term challenges in battery research and performance enhancement. Machine learning (ML), with its capability to solve intricate tasks and perform robust data processing, is now catalyzing a revolutionary transformation in the development of MIB materials and devices. In this review, we summarize the ML algorithms that have expedited research on MIBs over the past five years. We present an extensive overview of existing algorithms, elucidating their details, advantages, and limitations in applications that encompass electrode screening, material property prediction, electrolyte formulation design, electrode material characterization, manufacturing parameter optimization, and real-time battery status monitoring. Finally, we propose potential solutions and future directions for the application of ML in advancing MIB development.
Abstract: The advent of pandemics such as COVID-19 significantly impacts human behaviour and everyday life. It is therefore essential to make medical services connected to the internet available in every remote location during such situations. The security issues of the Internet of Medical Things (IoMT) devices used in these services make the situation even more critical, because cyberattacks on medical devices can cause treatment delays or clinical failures. Hence, services in the healthcare ecosystem need rapid, uninterrupted, and secure facilities. The solution provided in this research addresses security concerns and service availability for patients with critical health conditions in remote areas. This research aims to develop an intelligent software-defined network (SDN)-enabled secure framework for the IoT healthcare ecosystem. We propose a hybrid of machine learning and deep learning techniques (DNN + SVM) to identify network intrusions in sensor-based healthcare data. In addition, the system can efficiently monitor connected devices and suspicious behaviours. Finally, we evaluate the performance of the proposed framework using various performance metrics based on healthcare application scenarios. The experimental results show that the proposed approach effectively detects and mitigates attacks in SDN-enabled IoT networks and performs better than other state-of-the-art approaches.
Funding: This work is supported by the National Key R&D Program of China (No. 2022ZD0117501), the Singapore RIE2020 Advanced Manufacturing and Engineering Programmatic Grant by the Agency for Science, Technology and Research (A*STAR) under grant no. A1898b0043, the Tsinghua University Initiative Scientific Research Program, and the Low Carbon Energy Research Funding Initiative by A*STAR under grant number A-8000182-00-00.
Abstract: Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted how new materials for energy and the environment are researched and developed. This review provides an overview of, and perspectives on, ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, along with current bottlenecks and potential solutions. Through an applications-based perspective on AI-aided membrane design and discovery, we further show how ML strategies are applied across the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, ranging from gas and liquid separation membranes to fuel cell membranes. Furthermore, best practices for integrating ML methods with specific application targets in membrane design and discovery are presented, and an ideal paradigm is proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are highlighted at the end.
Funding: Supported by the National Natural Science Foundation of China (22278241), the National Key R&D Program of China (2018YFA0901700), a grant from the Institute Guo Qiang, Tsinghua University (2021GQG1016), and the Department of Chemical Engineering-iBHE Joint Cooperation Fund.
Abstract: Early non-invasive diagnosis of coronary heart disease (CHD) is critical, but accurate CHD diagnosis via breath analysis remains challenging. In this work, heterostructured complexes of black phosphorus (BP) and two-dimensional carbide and nitride (MXene), with high gas sensitivity and photoresponsiveness, were formulated using a self-assembly strategy. A light-activated virtual sensor array (LAVSA) based on BP/Ti_(3)C_(2)Tx was prepared under photomodulation and further assembled into an instant gas sensing platform (IGSP). In addition, a machine learning (ML) algorithm was introduced to help the IGSP detect and recognize the signals of breath samples to diagnose CHD. Owing to the synergistic effect of BP and Ti_(3)C_(2)Tx as well as photoexcitation, the synthesized heterostructured complexes exhibited higher performance than pristine Ti_(3)C_(2)Tx, with a response value 26% higher than that of pristine Ti_(3)C_(2)Tx. With the help of a pattern recognition algorithm, LAVSA successfully detected and identified 15 odor molecules belonging to the alcohol, ketone, aldehyde, ester, and acid families. Meanwhile, with the assistance of ML, the IGSP achieved 69.2% accuracy in classifying the breath odor of 45 volunteers, comprising healthy people and CHD patients. In conclusion, an immediate, low-cost, and accurate prototype was designed and fabricated for the non-invasive diagnosis of CHD, providing a generalized solution for diagnosing other diseases and for more complex application scenarios.
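The pattern-recognition step above, classifying breath samples from a multi-channel sensor-array response vector, reduces to ordinary supervised classification once each sample is a feature vector. A minimal sketch with wholly synthetic "responses" (the channel count, class separation, and logistic-regression classifier are assumptions; the paper does not specify this model):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
# Toy virtual-sensor-array responses: 8 channels per breath sample.
healthy = rng.normal(0.0, 1.0, size=(60, 8))
chd_like = rng.normal(1.0, 1.0, size=(60, 8))   # shifted response profile
X = np.vstack([healthy, chd_like])
y = np.r_[np.zeros(60), np.ones(60)]

# Shuffle, then hold out 30 samples for testing.
idx = rng.permutation(120)
X, y = X[idx], y[idx]
clf = LogisticRegression(max_iter=1000).fit(X[:90], y[:90])
acc = float(clf.score(X[90:], y[90:]))
```

In practice the "virtual" channels come from one physical sensor read out under different illumination conditions, which is what lets a single BP/MXene device behave like an array.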
Funding: Financial support from the National Natural Science Foundation of China (52207229) and from the China Scholarship Council (202207550010).
Abstract: The safe and reliable operation of lithium-ion batteries necessitates the accurate prediction of remaining useful life (RUL). However, this task is challenging due to diverse ageing mechanisms, varying operating conditions, and limited measured signals. Although data-driven methods are perceived as a promising solution, they ignore intrinsic battery physics, leading to compromised accuracy, low efficiency, and low interpretability. In response, this study integrates domain knowledge into deep learning to enhance RUL prediction performance. We demonstrate accurate RUL prediction using only a single charging curve. First, a generalisable physics-based model is developed to extract, from battery charging data, ageing-correlated parameters that can describe and explain battery degradation. These parameters inform a deep neural network (DNN) to predict RUL with high accuracy and efficiency. The trained model is validated on 3 types of batteries working under 7 conditions, considering both fully charged and partially charged cases. Using data from one cycle only, the proposed method achieves a root mean squared error (RMSE) of 11.42 cycles and a mean absolute relative error (MARE) of 3.19% on average, which are over 45% and 44% lower, respectively, than those of two state-of-the-art data-driven methods. Besides its accuracy, the proposed method also outperforms existing methods in terms of efficiency, input burden, and robustness. The inherent relationship between the model parameters and the battery degradation mechanism is further revealed, substantiating the intrinsic superiority of the proposed method.
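The two headline metrics above, RMSE in cycles and MARE as a percentage, are easy to mis-define (MARE normalises each error by the true RUL of that sample). A small sketch with made-up RUL values for illustration:

```python
import numpy as np

def rul_metrics(y_true, y_pred):
    """RMSE (in cycles) and mean absolute relative error (MARE),
    the two headline metrics reported for the RUL predictor."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = float(np.sqrt(np.mean((y_pred - y_true) ** 2)))
    mare = float(np.mean(np.abs(y_pred - y_true) / y_true))
    return rmse, mare

# Illustrative true vs predicted RUL, in cycles.
rmse, mare = rul_metrics([500, 400, 300], [510, 392, 306])  # mare = 0.02 (2%)
```

Because MARE is scale-free, it lets the paper compare performance across battery types whose absolute lifetimes differ, whereas RMSE keeps the error interpretable in cycles.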
Funding: National Natural Science Foundation of China (Nos. 52275346 and 52075287) and the Tsinghua University Initiative Scientific Research Program (20221080070).
Abstract: The recent wave of the artificial intelligence (AI) revolution has aroused unprecedented interest in making human society more intelligent. As an essential component bridging the physical world and digital signals, flexible sensors are evolving from single sensing elements into smarter systems capable of highly efficient acquisition, analysis, and even perception of vast, multifaceted data. While challenging from a manual perspective, the development of intelligent flexible sensing has been remarkably facilitated by rapid advances in brain-inspired AI innovations at both the algorithm (machine learning) and framework (artificial synapses) levels. This review presents the recent progress of emerging AI-driven, intelligent flexible sensing systems. The basic concepts of machine learning and artificial synapses are introduced. The new enabling features induced by the fusion of AI and flexible sensing are comprehensively reviewed, which significantly advance applications such as flexible sensory systems, soft/humanoid robotics, and human activity monitoring. As two of the most profound innovations of the twenty-first century, the deep incorporation of flexible sensing and AI technology holds tremendous potential for creating a smarter world for human beings.
Funding: Funded by the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA17040506), the National Natural Science Foundation of China (62005148/12004235), the Open Competition Mechanism to Select the Best Candidates Project of the Jinzhong Science and Technology Bureau (J202101), the DNL Cooperation Fund, CAS (DNL180311), and the 111 Project (B14041).
Abstract: Metal-halide hybrid perovskite materials are excellent candidates for solar cells and photoelectric devices. In recent years, machine learning (ML) techniques have developed rapidly in many fields and provided ideas for material discovery and design. ML can be applied to discover new materials quickly and effectively, with significant savings in resources and time compared with traditional experiments and density functional theory (DFT) calculations. In this review, we present the application of ML to perovskites and briefly review recent works in the field of ML-assisted perovskite design. First, the advantages of perovskites in solar cells and the merits of applying ML to perovskites are discussed. Second, the workflow of ML in perovskite design and some basic ML algorithms are introduced. Third, the applications of ML in predicting various properties of perovskite materials and devices are reviewed. Finally, we propose some prospects for the future development of this field. The rapid development of ML technology will greatly advance materials science, and ML will become an increasingly popular method for predicting the target properties of materials and devices.
Funding: Supported by the Fundamental Research Funds for the National Major Science and Technology Projects of China (No. 2017ZX05009-005).
Abstract: The application of carbon dioxide (CO_(2)) in enhanced oil recovery (EOR) has increased significantly, and CO_(2) solubility in oil is a key parameter in predicting CO_(2) flooding performance. Hydrocarbons are the major constituents of oil, so this work focuses on investigating the solubility of CO_(2) in hydrocarbons. However, current experimental measurements are time-consuming, and equations of state can be computationally complex. To address these challenges, we developed an artificial-intelligence-based model to predict the solubility of CO_(2) in hydrocarbons under varying conditions of temperature, pressure, molecular weight, and density. Using experimental data from previous studies, we trained and compared four machine learning models: support vector regression (SVR), extreme gradient boosting (XGBoost), random forest (RF), and multilayer perceptron (MLP). Among the four models, the XGBoost model had the best predictive performance, with an R^(2) of 0.9838. Additionally, sensitivity analysis and evaluation of the relative impact of each input parameter indicate that the prediction of CO_(2) solubility in hydrocarbons is most sensitive to pressure. Furthermore, our trained model was compared with existing models, demonstrating its higher accuracy and applicability. The developed machine-learning-based model provides a more efficient and accurate approach for predicting CO_(2) solubility in hydrocarbons, which may contribute to the advancement of CO_(2)-related applications in the petroleum industry.
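The sensitivity analysis described above, ranking the relative impact of temperature, pressure, molecular weight, and density on predicted solubility, can be sketched with permutation importance. The toy solubility function (dominated by pressure, echoing the paper's finding) and the use of scikit-learn's gradient boosting in place of XGBoost are both illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(4)
n = 500
# Illustrative inputs: temperature (K), pressure (MPa), molecular weight, density.
T = rng.uniform(300, 400, n)
P = rng.uniform(1, 20, n)
MW = rng.uniform(80, 300, n)
rho = rng.uniform(0.6, 0.9, n)
# Toy solubility dominated by pressure, mirroring the reported sensitivity.
sol = 0.4 * P - 0.01 * (T - 350) + 0.001 * MW + rng.normal(scale=0.3, size=n)

X = np.column_stack([T, P, MW, rho])
gbr = GradientBoostingRegressor(random_state=0).fit(X, sol)

# Permutation importance: how much the score drops when a column is shuffled.
imp = permutation_importance(gbr, X, sol, n_repeats=5, random_state=0)
most_sensitive = ["T", "P", "MW", "rho"][int(np.argmax(imp.importances_mean))]
```

Permutation importance is model-agnostic, so the same ranking procedure applies unchanged whether the underlying regressor is XGBoost, RF, SVR, or an MLP.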
Funding: The authors express their gratitude to the Higher Institution Centre of Excellence (HICoE) fund under project code JPT.S(BPKI)2000/016/018/015JId.4(21)/2022002HICOE, Universiti Tenaga Nasional (UNITEN) for funding the research through J510050002–IC–6 BOLDREFRESH2025, the Akaun Amanah Industri Bekalan Elektrik (AAIBE) Chair of Renewable Energy grant, and the NEC Energy Transition Grant (202203003ETG).
Abstract: With the projected global surge in hydrogen demand, driven by increasing applications and the imperative for low-emission hydrogen, the integration of machine learning (ML) across the hydrogen energy value chain is a compelling avenue. This review uniquely focuses on harnessing the synergy between ML and computational modeling (CM) or optimization tools, as well as on integrating multiple ML techniques with CM, for the synthesis of diverse hydrogen evolution reaction (HER) catalysts and various hydrogen production processes (HPPs). Furthermore, this review addresses a notable gap in the literature by offering insights, analyzing challenges, and identifying research prospects and opportunities for sustainable hydrogen production. While the literature reflects a promising landscape for ML applications in hydrogen energy domains, transitioning AI-based algorithms from controlled environments to real-world applications poses significant challenges. Hence, this comprehensive review delves into the technical, practical, and ethical considerations associated with the application of ML in HER catalyst development and HPP optimization. Overall, this review provides guidance for unlocking the transformative potential of ML in enhancing prediction efficiency and sustainability in the hydrogen production sector.
Funding: Supported by the National Key R&D Program of China (No. 2023YFC2908200), the National Natural Science Foundation of China (No. 52174249), and the Key Research and Development Program of Jiangxi Province (No. 20203BBGL73231).
Abstract: With the rise of artificial intelligence (AI) in mineral processing, predicting flotation indexes has attracted significant research attention. Nevertheless, current prediction models suffer from low accuracy and high prediction errors. Therefore, this paper adopts a two-step procedure. First, outliers are processed using the box-plot method and a filtering algorithm. Then, decision tree (DT), support vector regression (SVR), and random forest (RF) models, together with the bagging, boosting, and stacking ensemble algorithms, are employed to construct a flotation recovery prediction model. Extensive experiments compared the prediction accuracy of the six modeling methods on flotation recovery and examined the impact of different base-model combinations on the stacking model's prediction accuracy. In addition, field data have verified the model's effectiveness. This study demonstrates that the stacking ensemble approach, which uses ten variables to predict flotation recovery, yields a more favorable prediction than the bagging ensemble approach and the single models, achieving MAE, RMSE, R^(2), and MRE scores of 0.929, 1.370, 0.843, and 1.229%, respectively. The hit rates within error ranges of ±2% and ±4% are 82.4% and 94.6%, respectively. Consequently, the prediction is relatively precise and offers significant value in actual production.
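The stacking setup above, with DT/SVR/RF base learners combined by a meta-learner and evaluated by a hit rate within a tolerance band, can be sketched as follows. The synthetic ten-variable data, the linear meta-learner, and the specific base-model settings are illustrative assumptions, not the paper's tuned configuration.

```python
import numpy as np
from sklearn.ensemble import StackingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(5)
X = rng.normal(size=(400, 10))          # ten process variables, as in the paper
y = 85 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=1.0, size=400)  # recovery, %

X_tr, X_te, y_tr, y_te = X[:320], X[320:], y[:320], y[320:]

# Stack DT / SVR / RF base learners under a linear meta-learner.
stack = StackingRegressor(
    estimators=[("dt", DecisionTreeRegressor(random_state=0)),
                ("svr", SVR()),
                ("rf", RandomForestRegressor(n_estimators=100, random_state=0))],
    final_estimator=LinearRegression(),
)
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)

# "Hit rate": share of predictions within +/-2 recovery points of the truth.
hit_rate_2 = float(np.mean(np.abs(pred - y_te) <= 2.0))
```

`StackingRegressor` builds the meta-learner's inputs from cross-validated base-model predictions, which is what lets the meta-learner down-weight a weak base model instead of overfitting to it.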
Funding: Financially supported by the National Natural Science Foundation of China (Grant No. 52104013) and the China Postdoctoral Science Foundation (Grant No. 2022T150724).
Abstract: Due to the complexity and variability of carbonate formation leakage zones, lost circulation prediction and control is one of the major challenges of carbonate drilling; it raises well-control risks and production expenses. This research uses the H oilfield as an example, employs seismic attributes for mud loss prediction, and produces a complete set of pre-drilling mud loss prediction solutions. First, 16 seismic attributes are calculated from the post-stack seismic data, and the mud loss rate per unit footage is specified. The sample set is constructed by extracting each attribute from the seismic traces surrounding 15 typical wells, with an 8:2 ratio between the training set and the test set. With the calibrated mud loss rate per unit footage, the nonlinear mapping between the seismic attributes and the mud loss rate per unit footage is established using a mixture density network (MDN) model. Then, the influence of the number of sub-Gaussians and the uncertainty coefficient on the model's predictions is evaluated. Finally, the model is used in conjunction with downhole drilling conditions to assess the risk of mud loss in various layers and along the wellbore trajectory. The study demonstrates that the mean relative errors of the model are 6.9% on the training data and 7.5% on the test data, with R^(2) values of 90% and 88%, respectively. The accuracy and efficacy of mud loss prediction can be greatly enhanced by combining the 16 seismic attributes with the mud loss rate per unit footage and applying machine learning methods. The MDN-based model can not only predict the mud loss rate but also objectively evaluate the prediction based on the quality of the data and the model.
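What makes the mixture density network suited to "objectively evaluating the prediction" is that its head outputs a full distribution, a weighted sum of sub-Gaussians, rather than a single value. Given the mixture parameters for one sample, the point estimate and its uncertainty follow from the mixture moments. The three-component numbers below are invented for illustration, not taken from the paper.

```python
import numpy as np

# An MDN head outputs mixture weights, means, and sigmas per sample; here we
# hard-code an illustrative 3-sub-Gaussian prediction for one seismic trace.
weights = np.array([0.6, 0.3, 0.1])
means = np.array([2.0, 5.0, 9.0])     # candidate mud loss rates per unit footage
sigmas = np.array([0.5, 1.0, 2.0])

# Point estimate: mixture mean.  Uncertainty: mixture standard deviation,
# via the second moment  E[x^2] = sum_k w_k (sigma_k^2 + mu_k^2).
mix_mean = float(np.sum(weights * means))
second_moment = float(np.sum(weights * (sigmas ** 2 + means ** 2)))
mix_std = float(np.sqrt(second_moment - mix_mean ** 2))
```

A wide `mix_std` (or several well-separated, comparably weighted components) flags a prediction the model itself considers uncertain, which is the self-evaluation behaviour the abstract describes.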
Abstract: Pore pressure is essential data in drilling design, and its accurate prediction is necessary to ensure drilling safety and improve drilling efficiency. Traditional methods for predicting pore pressure are limited when encountering particular structures and lithologies. In this paper, machine learning algorithms and the effective stress theorem are used to establish a transformation model between rock physical parameters and pore pressure. This study collects data from three wells: Well 1 provided 881 data sets for model training, and Wells 2 and 3 provided 538 and 464 data sets, respectively, for model testing. Support vector machine (SVM), random forest (RF), extreme gradient boosting (XGB), and multilayer perceptron (MLP) are selected as the machine learning algorithms for pore pressure modeling. In addition, the grey wolf optimization (GWO) algorithm, particle swarm optimization (PSO) algorithm, sparrow search algorithm (SSA), and bat algorithm (BA) are used to establish hybrid machine learning optimization algorithms, and an improved grey wolf optimization (IGWO) algorithm is proposed. The IGWO-MLP model obtained the minimum root mean square error (RMSE) under 5-fold cross-validation on the training data. For the pore pressure data in Wells 2 and 3, the coefficients of determination (R^(2)) of SVM, RF, XGB, and MLP are 0.9930 and 0.9446, 0.9943 and 0.9472, 0.9945 and 0.9488, and 0.9949 and 0.9574, respectively. MLP achieves optimal performance on both the training and test data and shows a high degree of generalization, indicating that IGWO-MLP is an excellent predictor of pore pressure.
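The model-selection criterion above, minimum RMSE under 5-fold cross-validation on the training well, can be sketched as follows. The synthetic rock-physics features, the small MLP, and its hyperparameters are illustrative assumptions; in the paper these hyperparameters are what the IGWO optimizer tunes.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# Stand-ins for rock-physics logs (e.g. velocity, density) and pore pressure.
X = rng.normal(size=(300, 4))
y = 20 + 5 * X[:, 0] + 3 * X[:, 1] + rng.normal(scale=0.5, size=300)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                                   random_state=0))

# 5-fold cross-validated RMSE, the selection score a tuner would minimize.
rmses = []
for tr, te in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model.fit(X[tr], y[tr])
    err = model.predict(X[te]) - y[te]
    rmses.append(float(np.sqrt(np.mean(err ** 2))))
cv_rmse = float(np.mean(rmses))
```

A metaheuristic such as IGWO simply wraps this loop: each candidate hyperparameter set is scored by `cv_rmse`, and the search moves toward sets with lower scores.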
Funding: Supported by the research project "SafeDaBatt" (03EMF0409A) funded by the German Federal Ministry for Digital and Transport (BMDV), the National Key Research and Development Program of China (2022YFE0102700), the Key Research and Development Program of Shaanxi Province (2023-GHYB-05, 2023-YBSF-104), and the China Scholarship Council (CSC) (202206567008).
Abstract: Accurate aging diagnosis is crucial for the health and safety management of lithium-ion batteries in electric vehicles. Despite significant advancements achieved by data-driven methods, diagnosis accuracy remains constrained by the high cost of check-up tests and the scarcity of labeled data. This paper presents a framework utilizing self-supervised machine learning to harness the potential of unlabeled data for diagnosing battery aging in electric vehicles during field operation. We validate our method using battery degradation datasets collected over more than two years from twenty real-world electric vehicles. Our analysis comprehensively addresses cell inconsistencies, physical interpretations, and charging uncertainties in real-world applications. This is achieved through self-supervised feature extraction using random short charging sequences in the main peak of incremental capacity curves. By leveraging inexpensive unlabeled data in a self-supervised approach, our method demonstrates improvements in average root mean square error of 74.54% and 60.50% in the best and worst cases, respectively, compared to the supervised benchmark. This work underscores the potential of employing low-cost unlabeled data with self-supervised machine learning for effective battery health and safety management in real-world scenarios.
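The raw material for the feature extraction above is the incremental capacity (IC) curve, dQ/dV computed from a charging curve, whose main peak hosts the short sequences the framework samples. A minimal sketch on a synthetic sigmoid-like charging curve (the voltage window and plateau location are invented for illustration):

```python
import numpy as np

# Toy charging curve: capacity Q (Ah) versus terminal voltage V (V).
V = np.linspace(3.0, 4.2, 400)
Q = 2.0 / (1.0 + np.exp(-(V - 3.7) / 0.05))   # plateau centred near 3.7 V

# Incremental capacity dQ/dV; its main peak hosts the short charging
# sequences sampled for self-supervised feature extraction.
dQdV = np.gradient(Q, V)
peak_voltage = float(V[np.argmax(dQdV)])
```

As the cell ages, the main IC peak typically shrinks and shifts, so short sequences drawn from around `peak_voltage` carry degradation information even without labeled check-up tests.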