Journal Articles
45,062 articles found
High-throughput screening of CO_(2) cycloaddition MOF catalyst with an explainable machine learning model
1
Authors: Xuefeng Bai, Yi Li, Yabo Xie, Qiancheng Chen, Xin Zhang, Jian-Rong Li 《Green Energy & Environment》 SCIE EI CAS 2025, No. 1, pp. 132-138 (7 pages)
The high porosity and tunable chemical functionality of metal-organic frameworks (MOFs) make them a promising catalyst design platform. High-throughput screening of catalytic performance is feasible because large MOF structure databases are available. In this study, we report a machine learning model for high-throughput screening of MOF catalysts for the CO_(2) cycloaddition reaction. The descriptors for model training were judiciously chosen according to the reaction mechanism, which leads to a high accuracy of up to 97% with the 75% quantile of the training set as the classification criterion. The feature contributions were further evaluated with SHAP and PDP analyses to provide physical insight. Using the model, 12,415 hypothetical MOF structures and 100 reported MOFs were evaluated at 100 °C and 1 bar within one day, and 239 potentially efficient catalysts were discovered. Among them, MOF-76(Y) achieved the top experimental performance among the reported MOFs, in good agreement with the prediction.
Keywords: Metal-organic frameworks; High-throughput screening; Machine learning; Explainable model; CO_(2) cycloaddition
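A minimal sketch of the screening workflow this abstract describes: label MOFs as active or inactive using the 75% quantile of the training set, fit a classifier, and attribute predictions with SHAP. The descriptors, yields, and the gradient-boosting model choice below are synthetic stand-ins, since the abstract does not specify the algorithm.

```python
# Illustrative sketch only: quantile-threshold classification of MOF activity
# with SHAP attribution. All data and the model choice are assumptions.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                                # stand-in mechanism-based descriptors
yields = X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.3, 500)   # stand-in catalytic yields

X_tr, X_te, y_tr_raw, y_te_raw = train_test_split(X, yields, test_size=0.2, random_state=0)
threshold = np.quantile(y_tr_raw, 0.75)                      # 75% quantile of the training set as class boundary
y_tr, y_te = (y_tr_raw >= threshold).astype(int), (y_te_raw >= threshold).astype(int)

clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))

shap_values = shap.TreeExplainer(clf).shap_values(X_te)      # per-descriptor contributions
```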
Machine learning empowers efficient design of ternary organic solar cells with PM6 donor
2
Authors: Kiran A. Nirmal, Tukaram D. Dongale, Santosh S. Sutar, Atul C. Khot, Tae Geun Kim 《Journal of Energy Chemistry》 2025, No. 1, pp. 337-347 (11 pages)
Organic solar cells (OSCs) hold great potential as a photovoltaic technology for practical applications. However, the traditional experimental trial-and-error method for designing and engineering OSCs can be complex, expensive, and time-consuming. Machine learning (ML) techniques enable the proficient extraction of information from datasets, allowing the development of realistic models capable of predicting the efficacy of materials with commendable accuracy. The PM6 donor has great potential for high-performance OSCs. However, for the rational design of a ternary blend, it is crucial to accurately forecast the power conversion efficiency (PCE) of ternary OSCs (TOSCs) based on a PM6 donor. Accordingly, we collected the device parameters of PM6-based TOSCs and evaluated the feature importance of their molecular descriptors to develop predictive models. In this study, we used five different ML algorithms for analysis and prediction. For the analysis, the classification and regression tree provided different rules, heuristics, and patterns from the heterogeneous dataset. The random forest algorithm outperformed the other ML algorithms in predicting the output performance of PM6-based TOSCs. Finally, we validated the ML outcomes by fabricating PM6-based TOSCs. Our study presents a rapid strategy for assessing a high PCE while elucidating the substantial influence of diverse descriptors.
Keywords: Machine learning; Ternary organic solar cells; PM6 donor; PCE
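The abstract reports that a random forest best predicted PCE from the collected descriptors. A hedged sketch of that regression step, using synthetic stand-in descriptors rather than the paper's dataset:

```python
# Illustrative sketch only: random-forest regression of PCE with cross-validation
# and feature-importance ranking; data are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))                                 # stand-in molecular/device descriptors
pce = 10 + X[:, 0] - 0.8 * X[:, 2] + rng.normal(0, 0.5, 300)  # stand-in PCE values (%)

model = RandomForestRegressor(n_estimators=500, random_state=0)
print("5-fold CV R^2:", cross_val_score(model, X, pce, cv=5, scoring="r2").mean())

model.fit(X, pce)
print("descriptor importances:", model.feature_importances_.round(3))
```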
Prediction of the first 2^(+) states properties for atomic nuclei using light gradient boosting machine
3
Authors: Hui Liu, Xin-Xiang Li, Yun Yuan, Wen Luo, Yi Xu 《Nuclear Science and Techniques》 2025, No. 2, pp. 95-102 (8 pages)
The first 2^(+) excited states of a nucleus directly reflect the interplay between the shell structure and the nucleus, providing insights into the validity of the shell model and nuclear structure characteristics. Although the features of the first 2^(+) excited states can be measured for stable nuclei and calculated using nuclear models, significant uncertainty remains. This study employs a machine learning model based on a light gradient boosting machine (LightGBM) to investigate the first 2^(+) excited states. Specifically, the training of the LightGBM algorithm and the prediction of the first 2^(+) properties of 642 nuclei are presented. Furthermore, detailed comparisons of the LightGBM predictions were performed against available experimental data, shell model calculations, and Bayesian neural network (BNN) predictions. The results revealed that the average difference between the LightGBM predictions and the experimental data was 18 times smaller than that obtained by the shell model and only 70% of the BNN prediction results. Taking Mg, Ca, Kr, Sm, and Pb isotopes as examples, it was also observed that LightGBM can effectively reproduce the magic-number mutation caused by shell effects, with the energy being as low as 0.04 MeV due to shape coexistence. Therefore, we believe that leveraging LightGBM-based machine learning can profoundly enhance our insights into nuclear structures and provide new avenues for nuclear physics research.
Keywords: First 2^(+) state; Nuclear levels; Light gradient boosting machine
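A minimal sketch of how a LightGBM regressor could be trained to predict first 2^(+) excitation energies; the feature set (Z, N and simple combinations) and the toy data below are assumptions, since the abstract does not list the actual model inputs.

```python
# Hypothetical sketch: LightGBM regression of first 2+ excitation energies from
# simple nuclear features; the feature set and data are assumptions.
import numpy as np
from lightgbm import LGBMRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
Z = rng.integers(8, 100, size=600)
N = rng.integers(8, 150, size=600)
X = np.column_stack([Z, N, Z + N, np.abs(N - Z)])                 # stand-in features
e2plus = 2.0 / (1.0 + 0.02 * (Z + N)) + rng.normal(0, 0.05, 600)  # toy E(2_1^+) values in MeV

X_tr, X_te, y_tr, y_te = train_test_split(X, e2plus, test_size=0.2, random_state=0)
model = LGBMRegressor(n_estimators=500, learning_rate=0.05).fit(X_tr, y_tr)
print("test R^2:", model.score(X_te, y_te))
```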
Machine learning for predicting the outcome of terminal ballistics events (Cited by: 2)
4
Authors: Shannon Ryan, Neeraj Mohan Sushma, Arun Kumar AV, Julian Berk, Tahrima Hashem, Santu Rana, Svetha Venkatesh 《Defence Technology(防务技术)》 SCIE EI CAS CSCD 2024, No. 1, pp. 14-26 (13 pages)
Machine learning (ML) is well suited for the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis, such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than is achievable with any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Keywords: Machine learning; Artificial intelligence; Physics-informed machine learning; Terminal ballistics; Armour
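One of the physics-informed ingredients mentioned in this abstract, enforced monotonicity, can be imposed directly in gradient-boosting libraries. A hedged sketch using XGBoost's monotone_constraints on synthetic stand-in data; the features and constraint signs are illustrative, not the paper's setup.

```python
# Illustrative sketch: enforcing physics-motivated monotonicity in XGBoost,
# e.g. penetration depth non-decreasing in impact velocity. Data are synthetic.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
velocity = rng.uniform(200, 2000, 800)                     # impact velocity, m/s
hardness = rng.uniform(100, 500, 800)                      # target hardness, HB
X = np.column_stack([velocity, hardness])
depth = 0.02 * velocity - 0.05 * hardness + rng.normal(0, 2, 800)   # toy penetration depth, mm

# +1: prediction non-decreasing in velocity; -1: non-increasing in hardness
model = XGBRegressor(n_estimators=400, monotone_constraints=(1, -1))
model.fit(X, depth)
print(model.predict(np.array([[800.0, 250.0]])))
```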
Machine learning applications on lunar meteorite minerals: From classification to mechanical properties prediction (Cited by: 1)
5
Authors: Eloy Peña-Asensio, Josep M. Trigo-Rodríguez, Jordi Sort, Jordi Ibáñez-Insa, Albert Rimola 《International Journal of Mining Science and Technology》 SCIE EI CAS CSCD 2024, No. 9, pp. 1283-1292 (10 pages)
Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value, non-destructive testing methods are essential. This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis. This study explores the application of machine learning algorithms in predicting the mineralogical and mechanical properties of the DHOFAR 1084, JAH 838, and NWA 11444 lunar meteorites based solely on their atomic percentage compositions. Leveraging a prior-data fitted network model, we achieved near-perfect classification scores for meteorites, mineral groups, and individual minerals. The regressor models, notably the KNeighbors model, provided an outstanding estimate of the mechanical properties previously measured by nanoindentation tests, such as hardness, reduced Young's modulus, and elastic recovery. Further consideration of the nature and physical properties of the minerals forming these meteorites, including porosity, crystal orientation, and shock degree, is essential for refining predictions. Our findings underscore the potential of machine learning in enhancing mineral identification and mechanical property estimation in lunar exploration, paving the way for new advancements and quick assessments in extraterrestrial mineral mining, processing, and research.
Keywords: Meteorites; Moon; Mineralogy; Machine learning; Mechanical properties
Machine learning for membrane design and discovery (Cited by: 1)
6
Authors: Haoyu Yin, Muzi Xu, Zhiyao Luo, Xiaotian Bi, Jiali Li, Sui Zhang, Xiaonan Wang 《Green Energy & Environment》 SCIE EI CAS CSCD 2024, No. 1, pp. 54-70 (17 pages)
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norm of new materials for energy and environment. This review provides an overview and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, with the current bottlenecks and potential solutions. Through an applications-based perspective of AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, including gas, liquid, and fuel cell separation membranes. Furthermore, the best practices for integrating ML methods and specific application targets in membrane design and discovery are presented, with an ideal paradigm proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are also highlighted at the end.
Keywords: Machine learning; Membranes; AI for membranes; Data-driven design
Machine learning in metal-ion battery research: Advancing material prediction, characterization, and status evaluation (Cited by: 1)
7
Authors: Tong Yu, Chunyang Wang, Huicong Yang, Feng Li 《Journal of Energy Chemistry》 SCIE EI CAS CSCD 2024, No. 3, pp. 191-204, I0006 (15 pages)
Metal-ion batteries (MIBs), including alkali metal-ion (Li^(+), Na^(+), and K^(+)), multi-valent metal-ion (Zn^(2+), Mg^(2+), and Al^(3+)), metal-air, and metal-sulfur batteries, play an indispensable role in electrochemical energy storage. However, the performance of MIBs is significantly influenced by numerous variables, resulting in multi-dimensional and long-term challenges in the field of battery research and performance enhancement. Machine learning (ML), with its capability to solve intricate tasks and perform robust data processing, is now catalyzing a revolutionary transformation in the development of MIB materials and devices. In this review, we summarize the ML algorithms that have expedited research on MIBs over the past five years. We present an extensive overview of existing algorithms, elucidating their details, advantages, and limitations in various applications, which encompass electrode screening, material property prediction, electrolyte formulation design, electrode material characterization, manufacturing parameter optimization, and real-time battery status monitoring. Finally, we propose potential solutions and future directions for the application of ML in advancing MIB development.
Keywords: Metal-ion battery; Machine learning; Electrode materials; Characterization; Status evaluation
Accurate and efficient remaining useful life prediction of batteries enabled by physics-informed machine learning (Cited by: 1)
8
Authors: Liang Ma, Jinpeng Tian, Tieling Zhang, Qinghua Guo, Chunsheng Hu 《Journal of Energy Chemistry》 SCIE EI CAS CSCD 2024, No. 4, pp. 512-521 (10 pages)
The safe and reliable operation of lithium-ion batteries necessitates the accurate prediction of remaining useful life (RUL). However, this task is challenging due to diverse ageing mechanisms, various operating conditions, and limited measured signals. Although data-driven methods are perceived as a promising solution, they ignore intrinsic battery physics, leading to compromised accuracy, low efficiency, and low interpretability. In response, this study integrates domain knowledge into deep learning to enhance RUL prediction performance. We demonstrate accurate RUL prediction using only a single charging curve. First, a generalisable physics-based model is developed to extract ageing-correlated parameters that can describe and explain battery degradation from battery charging data. These parameters inform a deep neural network (DNN) that predicts RUL with high accuracy and efficiency. The trained model is validated on 3 types of batteries working under 7 conditions, considering fully charged and partially charged cases. Using data from one cycle only, the proposed method achieves a root mean squared error (RMSE) of 11.42 cycles and a mean absolute relative error (MARE) of 3.19% on average, which are over 45% and 44% lower than those of two state-of-the-art data-driven methods, respectively. Besides its accuracy, the proposed method also outperforms existing methods in terms of efficiency, input burden, and robustness. The inherent relationship between the model parameters and the battery degradation mechanism is further revealed, substantiating the intrinsic superiority of the proposed method.
Keywords: Lithium-ion batteries; Remaining useful life; Physics-informed machine learning
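A rough, assumption-laden sketch of the two-stage idea described above: fit interpretable ageing parameters to a charging curve with a simple model, then regress RUL on those parameters with a neural network. The exponential charging model and the synthetic cells below are placeholders, not the paper's generalisable physics-based model.

```python
# Rough sketch (assumed physics model and synthetic data): extract ageing
# parameters from charging curves, then map them to RUL with a small network.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.neural_network import MLPRegressor

def charge_model(t, q_max, tau):
    """Toy charging-curve model; the paper's physics-based model differs."""
    return q_max * (1.0 - np.exp(-t / tau))

rng = np.random.default_rng(0)
t = np.linspace(0, 3, 200)

X, rul = [], []
for age in rng.uniform(0.0, 1.0, size=100):                     # synthetic cells at different ages
    q = charge_model(t, 1.0 - 0.3 * age, 1.0 + 0.5 * age) + rng.normal(0, 0.005, t.size)
    params, _ = curve_fit(charge_model, t, q, p0=[1.0, 1.0])     # ageing-correlated parameters
    X.append(params)
    rul.append(1000 * (1.0 - age))                               # toy remaining useful life, cycles

dnn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
dnn.fit(np.array(X), np.array(rul))                              # parameters -> RUL
```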
An Intelligent SDN-IoT Enabled Intrusion Detection System for Healthcare Systems Using a Hybrid Deep Learning and Machine Learning Approach (Cited by: 1)
9
Authors: R. Arthi, S. Krishnaveni, Sherali Zeadally 《China Communications》 SCIE CSCD 2024, No. 10, pp. 267-287 (21 pages)
The advent of pandemics such as COVID-19 significantly impacts human behaviour and lives every day. Therefore, it is essential to make medical services connected to the internet available in every remote location during such situations. The security issues in the Internet of Medical Things (IoMT) used in these services make the situation even more critical, because cyberattacks on medical devices might cause treatment delays or clinical failures. Hence, services in the healthcare ecosystem need rapid, uninterrupted, and secure facilities. The solution provided in this research addresses security concerns and service availability for patients with critical health conditions in remote areas. This research aims to develop an intelligent software-defined network (SDN)-enabled secure framework for the IoT healthcare ecosystem. We propose a hybrid of machine learning and deep learning techniques (DNN + SVM) to identify network intrusions in sensor-based healthcare data. In addition, this system can efficiently monitor connected devices and suspicious behaviours. Finally, we evaluate the performance of our proposed framework using various performance metrics based on healthcare application scenarios. The experimental results show that the proposed approach effectively detects and mitigates attacks in SDN-enabled IoT networks and performs better than other state-of-the-art approaches.
Keywords: Deep neural network; Healthcare; Intrusion detection system; IoT; Machine learning; Software-defined networks
Evolution of pore systems in low-maturity oil shales during thermal upgrading--Quantified by dynamic SEM and machine learning (Cited by: 2)
10
Authors: Jun Liu, Xue Bai, Derek Elsworth 《Petroleum Science》 SCIE EI CAS CSCD 2024, No. 3, pp. 1739-1750 (12 pages)
In-situ upgrading by heating is feasible for low-maturity shale oil, where the pore space dynamically evolves. We characterize this response for a heated substrate concurrently imaged by SEM. We systematically follow the evolution of pore quantity, size (length, width, and cross-sectional area), orientation, shape (aspect ratio, roundness, and solidity), and their anisotropy, interpreted by machine learning. Results indicate that heating generates new pores in both organic matter and inorganic minerals. However, the newly formed pores are smaller than the original pores and thus reduce the average lengths and widths of the bedding-parallel pore system. Conversely, the average pore lengths and widths are increased in the bedding-perpendicular direction. Besides, heating increases the cross-sectional area of pores in low-maturity oil shales, where this growth tendency fluctuates at <300 °C but becomes steady at >300 °C. In addition, the orientation and shape of the newly formed heating-induced pores follow the habit of the original pores and follow the initial probability distributions of pore orientation and shape. Herein, limited anisotropy is detected in pore direction and shape, indicating similar modes of evolution both bedding-parallel and bedding-normal. We propose a straightforward but robust model to describe the evolution of the pore system in low-maturity oil shales during heating.
Keywords: Low-maturity oil shale; Pore elongation; Organic matter pyrolysis; In-situ thermal upgrading; Scanning electron microscopy (SEM); Machine learning
Light-Activated Virtual Sensor Array with Machine Learning for Non-Invasive Diagnosis of Coronary Heart Disease (Cited by: 1)
11
Authors: Jiawang Hu, Hao Qian, Sanyang Han, Ping Zhang, Yuan Lu 《Nano-Micro Letters》 SCIE EI CAS CSCD 2024, No. 12, pp. 427-448 (22 pages)
Early non-invasive diagnosis of coronary heart disease (CHD) is critical. However, it is challenging to achieve accurate CHD diagnosis via breath detection. In this work, heterostructured complexes of black phosphorus (BP) and two-dimensional carbide and nitride (MXene) with high gas sensitivity and photo-responsiveness were formulated using a self-assembly strategy. A light-activated virtual sensor array (LAVSA) based on BP/Ti_(3)C_(2)Tx was prepared under photomodulation and further assembled into an instant gas sensing platform (IGSP). In addition, a machine learning (ML) algorithm was introduced to help the IGSP detect and recognize the signals of breath samples to diagnose CHD. Due to the synergistic effect of BP and Ti_(3)C_(2)Tx as well as photoexcitation, the synthesized heterostructured complexes exhibited higher performance than pristine Ti_(3)C_(2)Tx, with a response value 26% higher than that of pristine Ti_(3)C_(2)Tx. In addition, with the help of a pattern recognition algorithm, LAVSA successfully detected and identified 15 odor molecules affiliated with alcohols, ketones, aldehydes, esters, and acids. Meanwhile, with the assistance of ML, the IGSP achieved 69.2% accuracy in detecting the breath odor of 45 volunteers comprising healthy people and CHD patients. In conclusion, an immediate, low-cost, and accurate prototype was designed and fabricated for the non-invasive diagnosis of CHD, which provides a generalized solution for diagnosing other diseases and other more complex application scenarios.
Keywords: Black phosphorus/MXene heterostructures; Light-activated virtual sensor array; Diagnosis of coronary heart disease; Machine learning
Digital identification of ginseng growth years based on Support Vector Machine and UPLC-QTOF-MS
12
Authors: 王献瑞, 郭晓晗, 张宇, 张佳婷, 贺方良, 荆文光, 李明华, 程显隆, 魏锋 《中国现代中药》 CAS 2024, No. 12, pp. 2049-2055 (7 pages)
Objective: To perform digital identification of ginseng growth years based on ultra-high performance liquid chromatography-quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) analysis with quantitative processing, combined with support vector machine (SVM) data modelling. Methods: Ginseng samples grown for 3, 4, 5, and 15 years were analysed by UPLC-QTOF-MS. Using pooled quality-control samples as the reference, peak alignment, extraction, and quantitative processing were performed to obtain exact mass-retention time (EMRT) data pairs reflecting the chemical composition. Data models were built with SVM, and, on the basis of 5-, 10-, and 20-fold internal cross-validation, the models were evaluated by accuracy (Acc), precision (P), area under the curve (AUC), and other parameters. The established data model was then used to identify ginseng growth years. Results: After quantitative processing, 6,556 EMRTs were obtained for all 80 batches of ginseng. The SVM-based data model showed excellent discrimination, with Acc, P, and AUC all greater than 0.900 and an external validation accuracy of 100%. Conclusion: UPLC-QTOF-MS analysis combined with the SVM algorithm can efficiently and accurately achieve digital identification of ginseng growth years, providing a reference for identifying the growth years of medicinal materials and for quality control of traditional Chinese medicine.
Keywords: Ginseng; Growth years; Machine learning; Support vector machine; Digitalization; UPLC-QTOF-MS
Physics-informed machine learning model for prediction of ground reflected wave peak overpressure
13
Authors: Haoyu Zhang, Yuxin Xu, Lihan Xiao, Canjie Zhen 《Defence Technology(防务技术)》 SCIE EI CAS CSCD 2024, No. 11, pp. 119-133 (15 pages)
The accurate prediction of the peak overpressure of explosion shockwaves is significant in fields such as explosion hazard assessment and structural protection, where explosion shockwaves serve as typical destructive elements. To address the insufficient accuracy of existing physical models for predicting the peak overpressure of ground reflected waves, two physics-informed machine learning models are constructed. The results demonstrate that the machine learning models, which incorporate physical information by predicting the deviation between the physical model and actual values and by adding a physical loss term to the loss function, can accurately predict both the training dataset and data outside it. Compared with existing physical models, the average relative error within the training domain is reduced from 17.459%-48.588% to 2%, and the proportion of average relative errors below 20% increases from 0%-59.4% to more than 99%. In addition, the average relative error outside the training set range is reduced from 14.496%-29.389% to 5%, and the proportion of relative errors below 20% increases from 0%-71.39% to more than 99%. The inclusion of a physical loss term enforcing monotonicity in the loss function effectively improves the extrapolation performance of machine learning. The findings of this study provide a valuable reference for explosion hazard assessment and anti-explosion structural design in various fields.
Keywords: Blast shock wave; Peak overpressure; Machine learning; Physics-informed machine learning
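A hedged sketch of the second ingredient described above, a physical loss term enforcing monotonicity: the penalty discourages predicted peak overpressures that increase with scaled distance. The network size, synthetic data, and weighting are illustrative assumptions, not the paper's formulation.

```python
# Illustrative sketch: data loss + monotonicity penalty (overpressure should
# decay with scaled distance). Model, data, and weighting are assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def physics_informed_loss(z, y_true, lam=1.0):
    """z: scaled distance (N,1); y_true: log peak overpressure (N,1)."""
    z = z.clone().requires_grad_(True)
    y_pred = model(z)
    data_loss = nn.functional.mse_loss(y_pred, y_true)
    dydz = torch.autograd.grad(y_pred.sum(), z, create_graph=True)[0]
    mono_penalty = torch.relu(dydz).mean()                 # penalise positive slopes
    return data_loss + lam * mono_penalty

# one illustrative optimisation step on synthetic data
z = torch.rand(64, 1) * 10 + 0.5
y = -1.5 * torch.log(z) + 0.1 * torch.randn_like(z)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = physics_informed_loss(z, y)
loss.backward()
opt.step()
```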
Improved PSO-Extreme Learning Machine Algorithm for Indoor Localization
14
Authors: Qiu Wanqing, Zhang Qingmiao, Zhao Junhui, Yang Lihua 《China Communications》 SCIE CSCD 2024, No. 5, pp. 113-122 (10 pages)
WiFi and fingerprinting localization methods have been a hot topic in indoor positioning because of their universality and location-related features. The basic assumption of fingerprinting localization is that the received signal strength indication (RSSI) distance accords with the location distance. Therefore, how to efficiently match the current RSSI of the user with the RSSI in the fingerprint database is the key to achieving high-accuracy localization. In this paper, a particle swarm optimization-extreme learning machine (PSO-ELM) algorithm is proposed on the basis of the original fingerprinting localization. Firstly, we collect the RSSI of the experimental area to construct the fingerprint database, and the ELM algorithm is applied in the online stage to determine the corresponding relation between the location of the terminal and the RSSI it receives. Secondly, the PSO algorithm is used to improve the biases and weights of the ELM neural network, and the globally optimal results are obtained. Finally, extensive simulation results are presented. It is shown that the proposed algorithm can effectively reduce the mean localization error and improve positioning accuracy when compared with the K-Nearest Neighbor (KNN), K-means, and back-propagation (BP) algorithms.
Keywords: Extreme learning machine; Fingerprinting localization; Indoor localization; Machine learning; Particle swarm optimization
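For reference, the ELM core of the PSO-ELM scheme is very compact: random input weights and biases, then a closed-form least-squares solution for the output weights. A sketch on synthetic RSSI data; in the paper's method, PSO would search the input weights and biases instead of leaving them random.

```python
# Minimal ELM sketch on synthetic RSSI fingerprints (assumed layout). In
# PSO-ELM, W and b below would be optimised by particle swarm optimisation.
import numpy as np

rng = np.random.default_rng(0)
n_ap, n_hidden, n_samples = 8, 64, 500

X = rng.uniform(-90, -30, size=(n_samples, n_ap))          # RSSI readings (dBm), synthetic
Y = rng.uniform(0, 20, size=(n_samples, 2))                # reference (x, y) positions, synthetic

W = rng.normal(size=(n_ap, n_hidden))                      # random input weights
b = rng.normal(size=n_hidden)                              # random hidden biases

H = np.tanh(X @ W + b)                                     # hidden-layer outputs
beta = np.linalg.pinv(H) @ Y                               # output weights, closed form

Y_hat = np.tanh(X @ W + b) @ beta
print("mean localization error:", np.linalg.norm(Y_hat - Y, axis=1).mean())
```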
Machine learning model based on non-convex penalized huberized-SVM
15
Authors: Peng Wang, Ji Guo, Lin-Feng Li 《Journal of Electronic Science and Technology》 EI CAS CSCD 2024, No. 1, pp. 81-94 (14 pages)
The support vector machine (SVM) is a classical machine learning method. Both the hinge loss and the least absolute shrinkage and selection operator (LASSO) penalty are usually used in traditional SVMs. However, the hinge loss is not differentiable, and the LASSO penalty does not have the oracle property. In this paper, the huberized loss is combined with non-convex penalties to obtain a model that has the advantages of both computational simplicity and the oracle property, contributing to higher accuracy than traditional SVMs. It is experimentally demonstrated that the two non-convex huberized-SVM methods, smoothly clipped absolute deviation huberized-SVM (SCAD-HSVM) and minimax concave penalty huberized-SVM (MCP-HSVM), outperform the traditional SVM method in terms of prediction accuracy and classifier performance. They are also superior in terms of variable selection, especially when there is high linear correlation between the variables. When applied to the prediction of listed companies, the variables that can affect and predict financial distress are accurately filtered out. Among all the indicators, the per-share indicators have the greatest influence while the solvency indicators have the weakest influence. Listed companies can assess their financial situation with the indicators screened by our algorithm and give early warning of possible financial distress with higher precision.
Keywords: Huberized loss; Machine learning; Non-convex penalties; Support vector machine (SVM)
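For context, a commonly used form of the huberized hinge loss (the exact parameterisation in the paper may differ) replaces the non-differentiable corner of the hinge loss with a quadratic segment of width $\delta$, with margin $t = y(\beta_0 + \mathbf{x}^{\top}\boldsymbol{\beta})$:

$$
\ell_{\delta}(t)=
\begin{cases}
0, & t>1,\\[4pt]
\dfrac{(1-t)^{2}}{2\delta}, & 1-\delta < t \le 1,\\[4pt]
1-t-\dfrac{\delta}{2}, & t \le 1-\delta,
\end{cases}
$$

and the penalized classifier minimises $\frac{1}{n}\sum_{i}\ell_{\delta}(t_i)+\sum_{j}p_{\lambda}(|\beta_j|)$, where $p_{\lambda}$ is a non-convex penalty such as SCAD or MCP in place of the LASSO.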
Machine Learning for Signal Demodulation in Underwater Wireless Optical Communications
16
Authors: Ma Shuai, Yang Lei, Ding Wanying, Li Hang, Zhang Zhongdan, Xu Jing, Li Zongyan, Xu Gang, Li Shiyin 《China Communications》 SCIE CSCD 2024, No. 5, pp. 297-313 (17 pages)
The underwater wireless optical communication (UWOC) system has gradually become essential to underwater wireless communication technology. Unlike other existing works on UWOC systems, this paper evaluates the proposed machine learning-based signal demodulation methods on a self-built experimental platform. Based on such a platform, we first construct a real signal dataset with ten modulation methods. Then, we propose a deep belief network (DBN)-based demodulator for feature extraction and multi-class feature classification. We also design an adaptive boosting (AdaBoost) demodulator as an alternative scheme without feature filtering for multiple modulated signals. Finally, extensive experimental results demonstrate that the AdaBoost demodulator significantly outperforms the other algorithms. They also reveal that demodulator accuracy decreases as the modulation order increases for a fixed received optical power. A higher-order modulation may achieve a higher effective transmission rate when the signal-to-noise ratio (SNR) is higher.
Keywords: AdaBoost; DBN; Machine learning; Signal demodulation
Prediction of sepsis within 24 hours at the triage stage in emergency departments using machine learning
17
Authors: Jingyuan Xie, Jiandong Gao, Mutian Yang, Ting Zhang, Yecheng Liu, Yutong Chen, Zetong Liu, Qimin Mei, Zhimao Li, Huadong Zhu, Ji Wu 《World Journal of Emergency Medicine》 SCIE CAS CSCD 2024, No. 5, pp. 379-385 (7 pages)
BACKGROUND: Sepsis is one of the main causes of mortality in intensive care units (ICUs). Early prediction is critical for reducing injury. As approximately 36% of sepsis cases occur within 24 h after emergency department (ED) admission in the Medical Information Mart for Intensive Care (MIMIC-IV), a prediction system for the ED triage stage would be helpful. Previous methods such as the quick Sequential Organ Failure Assessment (qSOFA) are more suitable for screening than for prediction in the ED, and we aimed to find a lightweight, convenient prediction method through machine learning. METHODS: We accessed MIMIC-IV for sepsis patient data from the EDs. Our dataset comprised demographic information, vital signs, and synthetic features. Extreme Gradient Boosting (XGBoost) was used to predict the risk of developing sepsis within 24 h after ED admission. Additionally, SHapley Additive exPlanations (SHAP) was employed to provide a comprehensive interpretation of the model's results. Ten percent of the patients were randomly selected as the testing set, while the remaining patients were used for training with 10-fold cross-validation. RESULTS: For 10-fold cross-validation on 14,957 samples, we reached an accuracy of 84.1% ± 0.3% and an area under the receiver operating characteristic (ROC) curve of 0.92 ± 0.02. The model achieved similar performance on the testing set of 1,662 patients. SHAP values showed that the five most important features were acuity, arrival transportation, age, shock index, and respiratory rate. CONCLUSION: Machine learning models such as XGBoost may be used for sepsis prediction using only a small amount of data conveniently collected at the ED triage stage. This may help reduce the workload in the ED and warn medical workers of the risk of sepsis in advance.
Keywords: Sepsis; Machine learning; Emergency department; Triage; Informatics
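A compact sketch of the modelling step this abstract describes: an XGBoost classifier evaluated with 10-fold cross-validation and interpreted with SHAP. The five feature names follow the abstract, but the data below are synthetic stand-ins, not MIMIC-IV.

```python
# Illustrative sketch: XGBoost + 10-fold CV + SHAP on synthetic triage data.
# Feature names follow the abstract; values and labels are synthetic.
import numpy as np
import shap
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(1, 6, n),        # acuity
    rng.integers(0, 4, n),        # arrival transportation (encoded)
    rng.uniform(18, 95, n),       # age
    rng.uniform(0.3, 1.6, n),     # shock index
    rng.uniform(10, 35, n),       # respiratory rate
])
y = (X[:, 3] + 0.01 * X[:, 2] + rng.normal(0, 0.3, n) > 1.9).astype(int)   # toy sepsis label

clf = XGBClassifier(n_estimators=300, eval_metric="logloss")
print("10-fold AUC:", cross_val_score(clf, X, y, cv=10, scoring="roc_auc").mean())

clf.fit(X, y)
shap_values = shap.TreeExplainer(clf).shap_values(X)       # per-feature contributions
```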
Collective Molecular Machines: Multidimensionality and Reconfigurability
18
Authors: Bin Wang, Yuan Lu 《Nano-Micro Letters》 SCIE EI CAS CSCD 2024, No. 8, pp. 309-340 (32 pages)
Molecular machines are key to cellular activity, where they are involved in converting chemical and light energy into efficient mechanical work. During the last 60 years, designing molecular structures capable of generating unidirectional mechanical motion at the nanoscale has been the topic of intense research. Effective progress has been made, attributed to advances in various fields such as supramolecular chemistry, biology, nanotechnology, and informatics. However, individual molecular machines are only capable of producing work at the nanometer scale and generally have only a single functionality. To address these problems, collective behaviors realized by integrating several or more of these individual mechanical units in space and time have become a new paradigm. In this review, we comprehensively discuss recent developments in the collective behaviors of molecular machines. In particular, collective behavior is divided into two paradigms. One is the appropriate integration of molecular machines to efficiently amplify molecular motions and deformations to construct novel functional materials. The other is the construction of swarming modes at the supramolecular level to perform nanoscale or microscale operations. We discuss design strategies for both modes and focus on the modulation of features and properties. Subsequently, to address existing challenges, the idea of transferring experience gained in the field of micro/nanorobotics is presented, offering prospects for future developments in the collective behavior of molecular machines.
Keywords: Molecular machines; Collective control; Collective behaviors; DNA; Biomolecular motors
The prediction of donor number and acceptor number of electrolyte solvent molecules based on machine learning
19
Authors: Huaping Hu, Yuqing Shan, Qiming Zhao, Jinglun Wang, Lingjun Wu, Wanqiang Liu 《Journal of Energy Chemistry》 SCIE EI CAS CSCD 2024, No. 11, pp. 374-382 (9 pages)
Electrolyte solvents have a critical impact on the design of high-performance and safe batteries. Gutmann's donor number (DN) and acceptor number (AN) are two important parameters for screening and designing superior electrolyte solvents. However, it is time-consuming and expensive to obtain DN and AN values through experimental measurements. Therefore, it is essential to develop a method to predict DN and AN values. This paper presents prediction models for DN and AN based on molecular structure descriptors of solvents, using four machine learning algorithms: CatBoost (Categorical Boosting), GBRT (Gradient Boosting Regression Tree), RF (Random Forest), and RR (Ridge Regression). The results showed that the DN and AN prediction models based on the CatBoost algorithm possess satisfactory prediction ability, with R^(2) values on the testing set of 0.860 and 0.96, respectively. Moreover, the study analyzed the molecular structure parameters that impact DN and AN. The results indicated that TDB02m (3D topological distance based descriptors - lag 2 weighted by mass) has a significant effect on DN, while HATS1s (leverage-weighted autocorrelation of lag 1 / weighted by I-state) plays an important role in AN. This work provides an efficient approach for accurately predicting DN and AN values, which is useful for screening and designing electrolyte solvents.
Keywords: Machine learning; Donor number; Acceptor number; Electrolyte solvents
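A hedged sketch of the DN regression step with CatBoost; the descriptors and DN values below are synthetic stand-ins for the paper's molecular structure descriptors, and an analogous model would be trained for AN.

```python
# Illustrative sketch: CatBoost regression of Gutmann donor number (DN) from
# molecular descriptors; all data are synthetic stand-ins.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                                  # stand-in structure descriptors
dn = 20 + 5 * X[:, 0] - 3 * X[:, 4] + rng.normal(0, 1, 200)     # stand-in DN values

X_tr, X_te, y_tr, y_te = train_test_split(X, dn, test_size=0.2, random_state=0)
model = CatBoostRegressor(iterations=1000, learning_rate=0.05, verbose=0)
model.fit(X_tr, y_tr)
print("test R^2:", model.score(X_te, y_te))                     # repeat analogously for AN
```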
Thermal conductivity of GeTe crystals based on machine learning potentials
20
Authors: 张健, 张昊春, 李伟峰, 张刚 《Chinese Physics B》 SCIE EI CAS CSCD 2024, No. 4, pp. 104-107 (4 pages)
GeTe has attracted extensive research interest for thermoelectric applications. In this paper, we first train a neuroevolution potential (NEP) on a dataset constructed by ab initio molecular dynamics, with the Gaussian approximation potential (GAP) as a reference. The phonon density of states is then calculated with the two machine learning potentials and compared with density functional theory results, with the GAP potential having higher accuracy. Next, the thermal conductivity of a GeTe crystal at 300 K is calculated by the equilibrium molecular dynamics method using both machine learning potentials, and both results are in good agreement with the experimental values; however, the calculation with the NEP potential is about 500 times faster than with the GAP potential. Finally, the lattice thermal conductivity in the range of 300 K-600 K is calculated using the NEP potential. The lattice thermal conductivity decreases as the temperature increases due to the phonon anharmonic effect. This study provides a theoretical tool for the study of the thermal conductivity of GeTe.
Keywords: Machine learning potentials; Thermal conductivity; Molecular dynamics
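For reference, equilibrium molecular dynamics calculations of lattice thermal conductivity are typically based on the Green-Kubo relation (the abstract does not spell out the implementation details):

$$
\kappa=\frac{1}{3Vk_{\mathrm{B}}T^{2}}\int_{0}^{\infty}\langle \mathbf{J}(0)\cdot\mathbf{J}(t)\rangle\,\mathrm{d}t,
$$

where $V$ is the simulation cell volume, $T$ the temperature, $k_{\mathrm{B}}$ the Boltzmann constant, and $\mathbf{J}$ the heat current; the integrand is the heat-current autocorrelation function sampled from the MD trajectory.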