An integrated evaluation system under randomness and fuzziness was developed in this work to systematically assess the risk of groundwater contamination in a small town in Central China. In this system, the randomness of the parameters and the fuzziness of the risk were considered simultaneously, and the standard-exceeding probability of contamination and the human health risk due to the contamination were integrated. The contamination risk was defined as a combination of "vulnerability" and "hazard". To calculate the value of "vulnerability", pollutant concentration was simulated by MODFLOW with random input variables, and a new modified health risk assessment (MRA) model was established to analyze the level of "hazard". The guideline-based limit concentration and the health risk due to manganese were systematically examined to obtain the general risk levels through a fuzzy rule base. "Vulnerability" and "hazard" were each divided into five categories: "high", "medium-high", "medium", "low-medium" and "low". The two were then combined through integrated evaluation. Compared with the other two scenarios under deterministic methods, the risk obtained in the proposed system is higher. This research illustrates that ignoring uncertainties in the evaluation process may underestimate the risk level.
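The combination of "vulnerability" and "hazard" categories through a fuzzy rule base can be sketched as follows. The five-level scale comes from the abstract; the specific combination rule (averaging the two level indices and rounding up, so ties resolve conservatively toward higher risk) is an illustrative assumption, not the paper's actual rule base.

```python
import math

# Five risk categories from the abstract, ordered from least to most severe.
LEVELS = ["low", "low-medium", "medium", "medium-high", "high"]

def combine_risk(vulnerability: str, hazard: str) -> str:
    """Map a (vulnerability, hazard) pair to an overall risk category by
    averaging the two level indices and rounding up (conservative)."""
    i, j = LEVELS.index(vulnerability), LEVELS.index(hazard)
    return LEVELS[math.ceil((i + j) / 2)]

print(combine_risk("medium", "high"))       # conservative average -> "medium-high"
print(combine_risk("low", "low-medium"))
```

A real rule base would typically be a full 5x5 lookup table elicited from experts rather than a formula, but the interface is the same: two linguistic inputs, one linguistic output.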
In this paper, the large deviations principle (LDP) and moderate deviations principle (MDP) of record numbers in random walks are studied under certain conditions. The results show that the rate functions of the LDP and MDP are different from those of weak record numbers, which is an interesting complement to the conclusions of Li and Yao [1].
The complete convergence for weighted sums of sequences of independent, identically distributed random variables under sublinear expectation space is studied. By moment inequalities and truncation methods, we establish equivalent conditions of complete convergence for weighted sums of sequences of independent, identically distributed random variables under sublinear expectation space. The results extend the corresponding results in probability space to sequences of independent, identically distributed random variables under sublinear expectation space.
In this paper, by utilizing the Marcinkiewicz-Zygmund inequality and a Rosenthal-type inequality for negatively superadditive dependent (NSD) random arrays, together with a truncation method, we investigate the complete f-moment convergence of NSD random variables. We establish and improve a general result on the complete f-moment convergence for Sung's type randomly weighted sums of NSD random variables under some general assumptions. As an application, we show the complete consistency of the randomly weighted estimator in a nonparametric regression model based on NSD errors.
Creutzfeldt-Jakob disease (CJD) is a rare neurodegenerative disorder characterized by abnormalities in the prion protein (PrP) and is the most common form of human prion disease. Although Genome-Wide Association Studies (GWAS) have identified numerous risk genes for CJD, the mechanisms underlying these risk loci remain poorly understood. This study aims to elucidate novel genetically prioritized candidate proteins associated with CJD in the human brain through an integrative analytical pipeline. Utilizing datasets from Protein Quantitative Trait Loci (pQTL) (N_pQTL1 = 152, N_pQTL2 = 376), expression QTL (eQTL) (N = 452), and the CJD GWAS (N_CJD = 4110, N_Controls = 13569), we implemented a systematic analytical pipeline. This pipeline included a Proteome-Wide Association Study (PWAS), Mendelian randomization (MR), Bayesian colocalization, and a Transcriptome-Wide Association Study (TWAS) to identify novel genetically prioritized candidate proteins implicated in CJD pathogenesis within the brain. Through PWAS, we identified that the altered abundance of six brain proteins was significantly associated with CJD. Two genes, STX6 and PDIA4, were established as lead causal genes for CJD, supported by robust evidence (False Discovery Rate < 0.05 in MR analysis; PP4/(PP3+PP4) ≥ 0.75 in Bayesian colocalization). Specifically, elevated levels of STX6 and PDIA4 were associated with an increased risk of CJD. Additionally, TWAS demonstrated that STX6 and PDIA4 were associated with CJD at the transcriptional level.
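The gene-prioritization criterion stated above (FDR < 0.05 in MR and colocalization support PP4/(PP3+PP4) ≥ 0.75) is a simple conjunctive filter; a minimal sketch follows. The candidate genes and their statistics here are made-up illustrative numbers, not the study's actual results.

```python
def is_lead_candidate(fdr: float, pp3: float, pp4: float) -> bool:
    """Keep a gene only if MR FDR < 0.05 and PP4/(PP3+PP4) >= 0.75."""
    if pp3 + pp4 == 0.0:          # no colocalization signal at all
        return False
    return fdr < 0.05 and pp4 / (pp3 + pp4) >= 0.75

# Hypothetical (fdr, pp3, pp4) triples for illustration only.
candidates = {"GENE_A": (0.01, 0.05, 0.90), "GENE_B": (0.20, 0.10, 0.80)}
leads = [g for g, (fdr, pp3, pp4) in candidates.items()
         if is_lead_candidate(fdr, pp3, pp4)]
print(leads)  # only GENE_A passes both filters
```

The PP4/(PP3+PP4) ratio conditions on there being a shared or distinct causal variant (hypotheses H3/H4), which is why it is preferred over raw PP4 when regional power is low.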
This paper proposes a longitudinal vulnerability-based analysis method to evaluate the impact of foundation pit excavation on shield tunnels, accounting for geological uncertainties. First, the shield tunnel is modeled as an Euler-Bernoulli beam resting on a Pasternak foundation, incorporating variability in subgrade parameters along the tunnel's length. A random analysis method based on random field theory is introduced to evaluate the tunnel's longitudinal responses to excavation. Next, a risk assessment index system is established. The normalized relative depth between the excavation and the shield tunnel is used as the risk index, while the maximum longitudinal deformation, the maximum circumferential opening, and the maximum longitudinal bending moment serve as performance indicators. Based on these, a method for analyzing the longitudinal fragility of shield tunnels under excavation-induced disturbances is proposed. Finally, the technique is applied to a case study involving a foundation pit excavation above a shield tunnel, the primary application scenario of this method. Vulnerability curves for the different performance indicators are derived, and the effects of tunnel stiffness and subgrade stiffness on tunnel vulnerability are explored. The results reveal significant differences in the vulnerability curves depending on the performance index used. Compared with the maximum circumferential opening and the maximum longitudinal bending moment, selecting the maximum longitudinal deformation as the control index better ensures the tunnel's usability and safety under excavation disturbances. The longitudinal vulnerability of the shield tunnel decreases nonlinearly as the tunnel stiffness and subgrade stiffness increase, with the subgrade stiffness having the more pronounced effect. Parametric analyses suggest that actively reinforcing the substratum is more effective in reducing the risk of tunnel failure due to adjacent excavations than passively reinforcing the tunnel structure.
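A common closed form for the vulnerability (fragility) curves mentioned above is a lognormal CDF: the probability that a response, e.g. the maximum longitudinal deformation, exceeds a performance threshold, as a function of a demand index such as the normalized relative depth. The median and dispersion values below are illustrative assumptions, not fitted values from the paper.

```python
from math import log
from statistics import NormalDist

def fragility(demand: float, median: float, beta: float) -> float:
    """P(exceedance | demand) under a lognormal fragility model with
    median capacity `median` and logarithmic dispersion `beta`."""
    return NormalDist().cdf(log(demand / median) / beta)

# Exceedance probability rises monotonically with the demand index.
probs = [fragility(d, median=1.0, beta=0.4) for d in (0.5, 1.0, 2.0)]
print([round(p, 3) for p in probs])
```

In practice the (median, beta) pair is estimated per performance indicator from the random-field Monte Carlo responses, which is why the curves differ across indicators.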
A typical Whipple shield consists of double-layered plates separated by a certain gap. Space debris impacts the outer plate and is broken into a debris cloud (shattered, molten, vaporized) with dispersed energy and momentum, which reduces the risk of penetrating the bulkhead. In the realm of hypervelocity impact, strain rate (>10^5 s^-1) effects are negligible, and fluid dynamics is employed to describe the impact process. Efficient numerical tools for precisely predicting the damage degree can greatly accelerate the design and optimization of advanced protective structures. Current hypervelocity impact research primarily focuses on the interaction between the projectile and the front plate and on the movement of the debris cloud. However, the damage mechanism of debris cloud impacts on the rear plate, the critical threat component, remains underexplored owing to complex multi-physics processes and prohibitive computational costs. Existing approaches, ranging from semi-empirical equations to machine learning-based ballistic limit prediction methods, are constrained to binary penetration classification. Moreover, uneven data from experiments and simulations render these methods ineffective when the projectile has an irregular shape and a complicated flight attitude. Therefore, it is urgent to develop a new method for predicting rear plate damage, which can help to gain a deeper understanding of the damage mechanism. In this study, a machine learning (ML) method is developed to predict the damage distribution in the rear plate. Based on the unit velocity space, discretized information on the debris cloud and rear plate damage from a small number of simulation cases is used as input data for training the ML models, while the generalization ability for damage distribution prediction is tested on other simulation cases with different attack angles. The results demonstrate that the training and prediction accuracies of the Random Forest (RF) algorithm significantly surpass those of Artificial Neural Networks (ANNs) and the Support Vector Machine (SVM). The RF-based model effectively identifies damage features in sparsely distributed debris clouds and the cumulative effect. This study establishes an expandable new dataset that accommodates additional parameters to improve prediction accuracy. The results demonstrate the model's ability to overcome data imbalance limitations through debris cloud features, enabling rapid and accurate rear plate damage prediction across wider scenarios with minimal data requirements.
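The bagging idea behind the Random Forest model above can be illustrated with a minimal sketch: many weak learners (here 1-D decision stumps) are trained on bootstrap resamples and vote on whether a cell of the rear plate is damaged. The toy 1-D data stand in for the discretized debris-cloud features; this is not the paper's actual model or dataset.

```python
import random

def fit_stump(sample):
    """Pick the threshold (from the sample's own feature values) that
    best separates damaged (True) from undamaged (False) labels."""
    best = (None, -1.0)
    for thr, _ in sample:
        acc = sum((x > thr) == y for x, y in sample) / len(sample)
        best = max(best, (thr, acc), key=lambda t: t[1])
    return best[0]

def forest_predict(x, stumps):
    """Majority vote of the stumps: damaged iff most thresholds are exceeded."""
    votes = sum(x > thr for thr in stumps)
    return votes > len(stumps) / 2

random.seed(0)
data = [(x / 10, x / 10 > 0.6) for x in range(11)]   # damage iff feature > 0.6
stumps = [fit_stump(random.choices(data, k=len(data))) for _ in range(25)]
print(forest_predict(0.95, stumps), forest_predict(0.05, stumps))
```

A real RF also subsamples features at each split and grows full trees; the bootstrap-plus-vote structure shown here is what gives the ensemble its robustness to noisy, imbalanced samples.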
Architecture frameworks have recently become an effective means of describing system of systems (SoS) architectures, for example the United States (US) Department of Defense Architecture Framework Version 2.0 (DoDAF 2.0). As a viewpoint in DoDAF 2.0, the operational viewpoint (OV) describes operational activities, nodes, and resource flows. The OV models are important for SoS architecture development. However, as SoS complexity increases, constructing OV models with traditional methods exposes shortcomings such as inefficient data collection and low modeling standards. Therefore, we propose an intelligent modeling method for five OV models: operational resource flows (OV-2), organizational relationships (OV-4), operational activity hierarchy (OV-5a), operational activities model (OV-5b), and operational activity sequences (OV-6c). The main idea of the method is to extract OV architecture data from text and generate interoperable OV models. First, we construct the OV meta model based on the DoDAF 2.0 meta model (DM2). Second, OV architecture named entities are recognized from text with a bidirectional long short-term memory and conditional random field (BiLSTM-CRF) model, and OV architecture relationships are collected with relationship extraction rules. Finally, we define the generation rules for OV models and develop an OV modeling tool. We use an unmanned surface vehicle (USV) swarm target defense SoS architecture as a case to verify the feasibility and effectiveness of the intelligent modeling method.
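The rule-based relationship extraction step can be sketched as follows: after named entities (nodes, activities, resources) are recognized, simple textual patterns link them into OV-2-style resource flows. The pattern and example sentence are made-up illustrations, not the paper's actual extraction rules.

```python
import re

# Hypothetical rule: "<source node> sends <resource> to <destination node>".
RULE = re.compile(r"(?P<src>\w[\w ]*?) sends (?P<res>\w[\w ]*?) to (?P<dst>\w[\w ]*)")

def extract_flow(sentence: str):
    """Return a (source, resource, destination) triple, or None if the
    sentence does not match the flow pattern."""
    m = RULE.search(sentence)
    return None if m is None else (m["src"], m["res"], m["dst"])

print(extract_flow("Command node sends target data to USV swarm"))
```

A production pipeline would hold many such patterns (one per relationship type in DM2) and apply them to sentences whose entity spans were already labeled by the BiLSTM-CRF model.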
In the mining industry, precise forecasting of rock fragmentation is critical for optimising blasting processes. In this study, we address the challenge of enhancing rock fragmentation assessment by developing a novel hybrid predictive model named GWO-RF. This model combines the grey wolf optimization (GWO) algorithm with the random forest (RF) technique to predict the D_80 value, a critical parameter in evaluating rock fragmentation quality. The study is conducted on a dataset from the Sarcheshmeh Copper Mine, employing six different swarm sizes for GWO-RF hybrid model construction. The GWO-RF model's hyperparameters are systematically optimized within established bounds, and its performance is rigorously evaluated using multiple evaluation metrics. The results show that the GWO-RF hybrid model has higher predictive skill, exceeding traditional models in terms of accuracy. Furthermore, the interpretability of the GWO-RF model is enhanced through the use of SHapley Additive exPlanations (SHAP) values. The insights gained from this research contribute to optimizing blasting operations and rock fragmentation outcomes in the mining industry.
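The core of the grey wolf optimization loop used for hyperparameter tuning can be sketched in a few lines: candidate solutions are pulled toward the three best wolves (alpha, beta, delta) with a coefficient that decays over the iterations. Here it minimizes a toy 1-D function rather than the D_80 prediction error; all parameter values are illustrative.

```python
import random

def gwo(f, lo, hi, wolves=12, iters=60, seed=1):
    """Minimal 1-D grey wolf optimizer minimizing f over [lo, hi]."""
    rng = random.Random(seed)
    pack = [rng.uniform(lo, hi) for _ in range(wolves)]
    for t in range(iters):
        a = 2.0 * (1 - t / iters)                  # decays from 2 to 0
        alpha, beta, delta = sorted(pack, key=f)[:3]
        new_pack = []
        for x in pack:
            pulls = []
            for leader in (alpha, beta, delta):
                A = a * (2 * rng.random() - 1)     # exploration coefficient
                C = 2 * rng.random()
                pulls.append(leader - A * abs(C * leader - x))
            new_pack.append(min(max(sum(pulls) / 3, lo), hi))
        pack = new_pack
    return min(pack, key=f)

best = gwo(lambda x: (x - 3) ** 2, lo=-10, hi=10)
print(round(best, 2))  # should converge near the minimum at x = 3
```

In the hybrid model, each "wolf" would instead encode an RF hyperparameter vector (number of trees, depth, etc.) and f would be a cross-validated prediction error.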
Rockburst is a common geological disaster in underground engineering that seriously threatens the safety of personnel, equipment, and property. Using machine learning models to evaluate rockburst risk is gradually becoming a trend. In this study, integrated algorithms under the Gradient Boosting Decision Tree (GBDT) framework were used to evaluate and classify rockburst intensity. First, a total of 301 rockburst data samples were obtained from a case database, and the data were preprocessed using the synthetic minority over-sampling technique (SMOTE). Then, rockburst evaluation models including GBDT, eXtreme Gradient Boosting (XGBoost), Light Gradient Boosting Machine (LightGBM), and Categorical Features Gradient Boosting (CatBoost) were established, and the optimal hyperparameters of the models were obtained through random grid search and five-fold cross-validation. Afterwards, the optimal hyperparameter configurations were used to fit the evaluation models, which were then analyzed on a test set. To evaluate performance, metrics including accuracy, precision, recall, and F1-score were selected for analysis and comparison with other machine learning models. Finally, the trained models were used to conduct rockburst risk assessment on rock samples from a mine in Shanxi Province, China, providing theoretical guidance for the mine's safe production. The models under the GBDT framework perform well in the evaluation of rockburst levels, and the proposed methods can provide a reliable reference for rockburst risk level analysis and safety management.
Objective: The causal relationship between eczema and autoimmune diseases has not been previously reported. This study aims to evaluate the causal relationship between eczema and autoimmune diseases. Methods: The two-sample Mendelian randomization (MR) method was used to assess the causal effect of eczema on autoimmune diseases. Summary data from the Genome-Wide Association Study (GWAS) Catalog were obtained from the Integrative Epidemiology Unit (IEU) database. For eczema and autoimmune diseases, genetic instrument variants (GIVs) were identified according to the genome-wide significance threshold (P < 5×10^-8). Causal effect estimates were generated using the inverse-variance weighted (IVW) method. MR-Egger, maximum likelihood, MR-PRESSO, and MR-RAPS methods were used for alternative analyses. Sensitivity tests, including heterogeneity, horizontal pleiotropy, and leave-one-out analyses, were performed. Finally, reverse causality was assessed. Results: Genetic susceptibility to eczema was associated with an increased risk of Crohn's disease (OR = 1.444, 95% CI 1.199 to 1.738, P < 0.001) and ulcerative colitis (OR = 1.002, 95% CI 1.001 to 1.003, P = 0.002). However, no causal relationship was found for the other six autoimmune diseases: systemic lupus erythematosus (SLE) (OR = 0.932, P = 0.401), bullous pemphigoid (BP) (OR = 1.191, P = 0.642), vitiligo (OR = 1.000, P = 0.327), multiple sclerosis (MS) (OR = 1.000, P = 0.965), ankylosing spondylitis (AS) (OR = 1.001, P = 0.121), and rheumatoid arthritis (RA) (OR = 1.000, P = 0.460). Additionally, no reverse causal relationship was found between autoimmune diseases and eczema. Conclusion: Eczema is associated with an increased risk of Crohn's disease and ulcerative colitis. No causal relationship is found between eczema and SLE, MS, AS, RA, BP, or vitiligo.
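The inverse-variance weighted (IVW) estimate at the core of the MR analysis above is simply a weighted average of the per-instrument Wald ratios, with weights 1/se². The effect sizes and standard errors below are made-up numbers for illustration, not the eczema summary statistics.

```python
def ivw(betas, ses):
    """Fixed-effect IVW causal estimate from per-instrument Wald ratios
    (betas) and their standard errors (ses)."""
    weights = [1.0 / se ** 2 for se in ses]
    return sum(w * b for w, b in zip(weights, betas)) / sum(weights)

wald_ratios = [0.30, 0.40, 0.35]     # hypothetical per-SNP exposure->outcome ratios
standard_errors = [0.10, 0.20, 0.10]
print(round(ivw(wald_ratios, standard_errors), 4))
```

The alternative estimators listed in the abstract (MR-Egger, MR-PRESSO, MR-RAPS) relax the IVW assumption that every instrument is valid, which is why agreement across them strengthens the causal claim.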
In this paper, we investigate the complete convergence and complete moment convergence for weighted sums of arrays of rowwise asymptotically negatively associated (ANA) random variables, without assuming identical distribution. The obtained results not only extend those of An and Yuan [1] and Shen et al. [2] to the case of ANA random variables, but also partially improve them.
With the increasing complexity of production processes, there has been a growing focus on online algorithms within the domain of multivariate statistical process control (SPC). Nonetheless, conventional methods, based on the assumption of complete data obtained at uniform time intervals, exhibit suboptimal performance in the presence of missing data. In our pursuit of maximizing the available information, we propose an adaptive exponentially weighted moving average (EWMA) control chart employing a weighted imputation approach that leverages the relationships between complete and incomplete data. Specifically, we introduce two recovery methods: an improved K-nearest neighbors imputed value and the conventional univariate EWMA statistic. We then formulate an adaptive weighting function to combine these methods, assigning a diminished weight to the EWMA statistic when the sample information suggests an increased likelihood of the process being out of control, and vice versa. The robustness and sensitivity of the proposed scheme are demonstrated through simulation results and an illustrative example.
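The adaptive imputation idea described above can be sketched as follows: when an observation is missing, it is replaced by a weighted blend of a neighbor-based recovery value and the running EWMA statistic, with the EWMA trusted less when the chart is far from target (a possible shift). The weighting function and the data are simplified illustrative choices, not the paper's exact scheme.

```python
def ewma_chart(observations, lam=0.2, target=0.0):
    """Run an EWMA chart over (observation, knn_estimate) pairs, where a
    missing observation is None and is imputed before the update."""
    z = target
    path = []
    for obs, knn_estimate in observations:
        if obs is None:                               # missing: impute
            # trust the EWMA less when z is far from target
            w = 1.0 / (1.0 + abs(z - target))
            obs = w * z + (1.0 - w) * knn_estimate
        z = lam * obs + (1.0 - lam) * z               # standard EWMA update
        path.append(z)
    return path

# In-control start, then a mean shift; every other observation is missing.
data = [(0.1, None), (None, 0.3), (2.5, None), (None, 2.4)]
print([round(z, 3) for z in ewma_chart(data)])
```

Signaling would then compare each z against time-varying EWMA control limits; the blend lets the chart keep tracking the shift even through the missing observations.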
In this paper, an efficient unequal error protection (UEP) scheme for online fountain codes is proposed. In the build-up phase, a traversing-selection strategy is proposed to select the most important symbols (MIS). Then, in the completion phase, a weighted-selection strategy is applied to provide low overhead. The performance of the proposed scheme is analyzed and compared with the existing UEP online fountain scheme. Simulation results show that, in terms of the MIS and the least important symbols (LIS), when the bit error ratio is 10^-4, the proposed scheme achieves 85% and 31.58% overhead reduction, respectively.
This article introduces a finite element procedure using the bilinear quadrilateral element, or four-node rectangular element (namely the Q4 element), based on a refined first-order shear deformation theory (rFSDT) and Monte Carlo simulation (MCS), a so-called refined stochastic finite element method, to investigate the random vibration of functionally graded material (FGM) plates subjected to a moving load. The advantage of the proposed method is that it uses rFSDT to improve the accuracy of classical FSDT and satisfy the stress-free condition at the plate boundaries, combined with MCS to analyze the vibration of the FGM plate when the input parameters are random quantities following a normal distribution. The obtained results show that the distribution characteristics of the vibration response of the FGM plate depend on the standard deviation of the input parameters and the velocity of the moving load. Furthermore, the numerical results in this study are expected to contribute to improving the understanding of FGM plates subjected to moving loads with uncertain input parameters.
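The Monte Carlo simulation layer described above has a simple structure: draw the uncertain inputs from normal distributions, evaluate a response for each draw, and summarize the output distribution. The closed-form "response" below is a stand-in for the finite element vibration solver; the input means and standard deviations are illustrative assumptions.

```python
import random
import statistics

def monte_carlo(response, input_specs, n=20000, seed=7):
    """Propagate normal input uncertainty through `response`; each spec is
    a (mean, std) pair for one input parameter."""
    rng = random.Random(seed)
    samples = [response(*[rng.gauss(mu, sd) for mu, sd in input_specs])
               for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

# Toy response: deflection proportional to load / stiffness-like modulus.
mean, sd = monte_carlo(lambda p, e: p / e,
                       input_specs=[(10.0, 0.5), (200.0, 5.0)])
print(round(mean, 4), round(sd, 4))
```

The same loop wraps any deterministic solver unchanged, which is why MCS pairs naturally with an existing FE code: only the input sampling and the output statistics are new.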
This study focuses on improving path planning efficiency for underwater gravity-aided navigation. First, a Depth Sorting Fast Search (DSFS) algorithm was proposed to improve the planning speed of the Quick Rapidly-exploring Random Trees* (Q-RRT*) algorithm. A cost inequality relationship between an ancestor and its descendants was derived, and the ancestors were filtered accordingly. Second, an underwater gravity-aided navigation path planning system was designed based on the DSFS algorithm, taking into account the fitness, safety, and asymptotic optimality of the routes according to the gravity suitability distribution of the navigation space. Finally, experimental comparisons of the computing performance of the ChooseParent procedure, the Rewire procedure, and the combination of the two procedures for Q-RRT* and DSFS were conducted under the same planning environment and parameter conditions. The results showed that the computational efficiency of the DSFS algorithm improved by about 1.2 times compared with the Q-RRT* algorithm while ensuring correct computational results.
Mendelian randomization (MR) is widely used in causal mediation analysis to control for unmeasured confounding effects, and it is valid under some strong assumptions. It is thus of great interest to assess the impact of violations of these MR assumptions through sensitivity analysis. Sensitivity analyses have been conducted for simple MR-based causal average effect analyses, but they are not available for MR-based mediation analysis studies, and we aim to fill this gap in this paper. We propose two sensitivity parameters to quantify the effect of deviations from the IV assumptions. With these two sensitivity parameters, we derive consistent indirect causal effect estimators and establish their asymptotic properties. Our theoretical results can be used in MR-based mediation analysis to study the impact of violations of MR assumptions. The finite sample performance of the proposed method is illustrated through simulation studies, sensitivity analysis, and application to a real genome-wide association study.
Many digital platforms have employed free-content promotion strategies to deal with the high uncertainty levels surrounding digital content products. However, the diversity of digital content products and user heterogeneity in content preference may blur the impact of platform promotions across users and products. Therefore, free-content promotion strategies should be adapted to allocate marketing resources optimally and increase revenue. This study develops personalized free-content promotion strategies based on individual-level heterogeneous treatment effects and explores the causes of their heterogeneity, focusing on the moderating effect of user engagement-related variables. To this end, we utilize random field experimental data provided by a top Chinese e-book platform. We employ a framework that combines machine learning with econometric causal inference methods to estimate individual treatment effects and analyze their potential mechanisms. The analysis shows that, on average, free-content promotions lead to a significant increase in consumer payments. However, the higher the level of user engagement, the lower the payment lift caused by promotions, as more-engaged users are more strongly affected by the cannibalization effect of free-content promotion. This study introduces a novel causal research design to help platforms improve their marketing strategies.
Let X = Σ_{i=1}^{n} a_i ξ_i be a Rademacher sum with Var(X) = 1, and let Z be a standard normal random variable. This paper concerns the upper bound of |P(X ≤ x) − P(Z ≤ x)| for any x ∈ R. Using symmetry properties and the R software, this paper obtains the following improved Berry-Esseen type bound under some conditions: |P(X ≤ x) − P(Z ≤ x)| ≤ P(Z ∈ (0, a_1)) for all x ∈ R, which is one of the modified conjectures proposed by Nathan K. and Ohad K.
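The bound above is easy to probe numerically for a small n: the exact CDF of X is computed by enumerating all 2^n sign patterns and the worst gap to the normal CDF is compared against P(Z ∈ (0, a_1)). The coefficients below are an illustrative choice with Σ a_i² = 1, not from the paper.

```python
from itertools import product
from statistics import NormalDist

a = [0.6, 0.48, 0.64]              # illustrative coefficients, sum of squares = 1
Z = NormalDist()
bound = Z.cdf(max(a)) - 0.5        # P(Z in (0, a_1)) with a_1 the largest coefficient

def gap(x):
    """|P(X <= x) - P(Z <= x)| with P(X <= x) computed exactly by
    enumerating all 2^n Rademacher sign patterns."""
    p = sum(1 for s in product((-1, 1), repeat=len(a))
            if sum(ai * si for ai, si in zip(a, s)) <= x) / 2 ** len(a)
    return abs(p - Z.cdf(x))

worst = max(gap(x / 100) for x in range(-300, 301))
print(round(worst, 4), "<=", round(bound, 4))
```

Sampling x on a grid only approximates the supremum from below, so the check can confirm the bound on the grid but cannot prove it; the paper's argument is analytic.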
As the“engine”of continuous and repeated equipment operation, equipment maintenance support plays an increasingly prominent role in the confrontation of symmetrical combat systems. As the basis and guide for the planning and implementation of equipment maintenance tasks, equipment damage measurement is an important guarantee for the effective implementation of maintenance support. First, starting from the basic problem of wartime equipment damage measurement, this article comprehensively analyzes the factors influencing damage measurement from the enemy's attributes, our own attributes, and the battlefield environment. Second, this article determines the key factors based on fuzzy comprehensive evaluation (FCE) and performs principal component analysis (PCA) on the key factors. Finally, the principal components representing more than 85% of the data features are taken as the input, and the equipment damage quantity is taken as the output. The data are trained and tested with an artificial neural network (ANN) and a random forest (RF). In summary, FCE-PCA-RF can be used as a reference for research on wartime equipment damage estimation.
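The fuzzy comprehensive evaluation (FCE) step above combines a factor-weight vector with a membership matrix (factors x grades) via the weighted-average operator to score each grade. The three factor groups follow the abstract (enemy, own side, environment); the four damage grades and all numeric values are illustrative assumptions.

```python
def fce(weights, membership):
    """Return grade scores B = W . R using the weighted-average operator."""
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(len(membership[0]))]

weights = [0.5, 0.3, 0.2]                 # enemy, own side, environment (assumed)
membership = [[0.1, 0.3, 0.4, 0.2],       # memberships over 4 damage grades
              [0.2, 0.4, 0.3, 0.1],
              [0.3, 0.3, 0.2, 0.2]]
scores = fce(weights, membership)
print([round(s, 2) for s in scores])      # the highest-scoring grade is selected
```

Classical FCE sometimes uses the max-min operator instead of the weighted average; the weighted average keeps more information from the membership matrix, which suits a screening step that feeds PCA afterwards.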
Funding: Projects (51039001, 51009063) supported by the National Natural Science Foundation of China; Project (SX2010-026) supported by the State Council Three Gorges Project Construction Committee Executive Office, China; Project (2012BS046) supported by Henan University of Technology, China; Project (BYHGLC-2010-02) supported by the Guangzhou Water Authority, China.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 11671145) and the Science and Technology Commission of Shanghai Municipality (Grant No. 18dz2271000).
Funding: Supported by the Doctoral Scientific Research Starting Foundation of Jingdezhen Ceramic University (Grant No. 102/01003002031) and the Re-accompanying Funding Project of Academic Achievements of Jingdezhen Ceramic University (Grant Nos. 215/20506277, 215/20506341).
Funding: Supported by the National Social Science Foundation (Grant No. 21BTJ040) and the Project of Outstanding Young People in University of Anhui Province (Grant Nos. 2023AH020037, SLXY2024A001).
Funding: Project (52178402) supported by the National Natural Science Foundation of China.
Abstract: This paper proposes a longitudinal vulnerability-based analysis method to evaluate the impact of foundation pit excavation on shield tunnels, accounting for geological uncertainties. First, the shield tunnel is modeled as an Euler-Bernoulli beam resting on a Pasternak foundation, incorporating variability in subgrade parameters along the tunnel's length. A random analysis method using random field theory is introduced to evaluate the tunnel's longitudinal responses to excavation. Next, a risk assessment index system is established. The normalized relative depth between the excavation and the shield tunnel is used as a risk index, while the maximum longitudinal deformation, the maximum circumferential opening, and the maximum longitudinal bending moment serve as performance indicators. On this basis, a method for analyzing the longitudinal fragility of shield tunnels under excavation-induced disturbances is proposed. Finally, the technique is applied to a case study involving a foundation pit excavation above a shield tunnel, the primary application scenario of this method. Vulnerability curves for different performance indicators are derived, and the effects of tunnel stiffness and subgrade stiffness on tunnel vulnerability are explored. The results reveal significant differences in the vulnerability curves depending on the performance index used. Compared with the maximum circumferential opening and the maximum longitudinal bending moment, selecting the maximum longitudinal deformation as the control index better ensures the tunnel's usability and safety under excavation disturbances. The longitudinal vulnerability of the shield tunnel decreases nonlinearly as tunnel stiffness and subgrade stiffness increase, with subgrade stiffness having the more pronounced effect. Parametric analyses suggest that actively reinforcing the substratum is more effective in reducing the risk of tunnel failure due to adjacent excavations than passive reinforcement of the tunnel structure.
基金supported by National Natural Science Foundation of China(Grant No.12432018,12372346)the Innovative Research Groups of the National Natural Science Foundation of China(Grant No.12221002).
Abstract: A typical Whipple shield consists of double-layered plates separated by a certain gap. Space debris impacts the outer plate and is broken into a debris cloud (shattered, molten, vaporized) with dispersed energy and momentum, which reduces the risk of penetrating the bulkhead. In the realm of hypervelocity impact, strain-rate (>10^5 s^-1) effects are negligible, and fluid dynamics is employed to describe the impact process. Efficient numerical tools for precisely predicting the damage degree can greatly accelerate the design and optimization of advanced protective structures. Current hypervelocity impact research primarily focuses on the interaction between the projectile and the front plate and on the movement of the debris cloud. However, the damage mechanism of debris cloud impacts on the rear plate, the critical threat component, remains underexplored owing to complex multi-physics processes and prohibitive computational costs. Existing approaches, ranging from semi-empirical equations to machine learning-based ballistic limit prediction methods, are constrained to binary penetration classification. Moreover, uneven data from experiments and simulations render these methods ineffective when the projectile has an irregular shape and a complicated flight attitude. It is therefore urgent to develop a new method for predicting rear plate damage, which can help to deepen understanding of the damage mechanism. In this study, a machine learning (ML) method is developed to predict the damage distribution in the rear plate. Based on the unit velocity space, discretized information on the debris cloud and rear plate damage from rare simulation cases is used as input data for training the ML models, while the generalization ability for damage distribution prediction is tested on other simulation cases with different attack angles. The results demonstrate that the training and prediction accuracies using the Random Forest (RF) algorithm significantly surpass those using Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs). The RF-based model effectively identifies damage features of the sparsely distributed debris cloud and the cumulative effect. This study establishes an expandable new dataset that accommodates additional parameters to improve prediction accuracy. The results demonstrate the model's ability to overcome data imbalance limitations through debris cloud features, enabling rapid and accurate rear plate damage prediction across wider scenarios with minimal data requirements.
Funding: National Natural Science Foundation of China (71690233, 71971213, 71901214).
Abstract: The architecture framework has recently become an effective method to describe system of systems (SoS) architectures, exemplified by the United States (US) Department of Defense Architecture Framework Version 2.0 (DoDAF 2.0). As a viewpoint in DoDAF 2.0, the operational viewpoint (OV) describes operational activities, nodes, and resource flows. The OV models are important for SoS architecture development. However, as SoS complexity increases, constructing OV models with traditional methods exposes shortcomings such as inefficient data collection and low modeling standards. Therefore, we propose an intelligent modeling method for five OV models: operational resource flows (OV-2), organizational relationships (OV-4), the operational activity hierarchy (OV-5a), the operational activities model (OV-5b), and operational activity sequences (OV-6c). The main idea of the method is to extract OV architecture data from text and generate interoperable OV models. First, we construct the OV meta-model based on the DoDAF 2.0 meta-model (DM2). Second, OV architecture named entities are recognized from text based on the bidirectional long short-term memory and conditional random field (BiLSTM-CRF) model, and OV architecture relationships are collected with relationship extraction rules. Finally, we define generation rules for the OV models and develop an OV modeling tool. We use an unmanned surface vehicle (USV) swarm target defense SoS architecture as a case to verify the feasibility and effectiveness of the intelligent modeling method.
Funding: Projects (42177164, 52474121) supported by the National Natural Science Foundation of China; Project (PBSKL2023A12) supported by the State Key Laboratory of Precision Blasting and Hubei Key Laboratory of Blasting Engineering, China.
Abstract: In the mining industry, precise forecasting of rock fragmentation is critical for optimizing blasting processes. In this study, we address the challenge of enhancing rock fragmentation assessment by developing a novel hybrid predictive model named GWO-RF. This model combines the grey wolf optimization (GWO) algorithm with the random forest (RF) technique to predict the D_(80) value, a critical parameter in evaluating rock fragmentation quality. The study is conducted using a dataset from the Sarcheshmeh Copper Mine, employing six different swarm sizes for the GWO-RF hybrid model construction. The GWO-RF model's hyperparameters are systematically optimized within established bounds, and its performance is rigorously evaluated using multiple evaluation metrics. The results show that the GWO-RF hybrid model achieves higher predictive accuracy than traditional models. Furthermore, the interpretability of the GWO-RF model is enhanced through the utilization of SHapley Additive exPlanations (SHAP) values. The insights gained from this research contribute to optimizing blasting operations and rock fragmentation outcomes in the mining industry.
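As a rough illustration of the optimizer half of such a hybrid model, the sketch below implements a minimal one-dimensional grey wolf optimization loop on a toy quadratic objective. This is not the paper's GWO-RF code; the swarm size, iteration count, bounds, and objective are arbitrary choices for illustration.

```python
import random

def gwo_minimize(f, lo, hi, n_wolves=12, n_iter=60, seed=1):
    """Toy 1-D grey wolf optimization (GWO): each wolf moves toward the
    three best solutions found so far (alpha, beta, delta)."""
    rng = random.Random(seed)
    wolves = [rng.uniform(lo, hi) for _ in range(n_wolves)]
    for t in range(n_iter):
        wolves.sort(key=f)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1.0 - t / n_iter)  # exploration factor decays from 2 to 0
        for i in range(n_wolves):
            moves = []
            for leader in (alpha, beta, delta):
                A = 2.0 * a * rng.random() - a
                C = 2.0 * rng.random()
                D = abs(C * leader - wolves[i])
                moves.append(leader - A * D)
            # new position: average of the three leader-guided moves, clipped
            wolves[i] = min(hi, max(lo, sum(moves) / 3.0))
    return min(wolves, key=f)

# minimize a toy objective with known optimum at x = 3
best = gwo_minimize(lambda x: (x - 3.0) ** 2, -10.0, 10.0)
```

In the hybrid scheme the objective would instead be the cross-validated error of an RF model as a function of its hyperparameters.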
Funding: Project (52161135301) supported by the International Cooperation and Exchange Program of the National Natural Science Foundation of China; Project (202306370296) supported by the China Scholarship Council.
Abstract: Rockburst is a common geological disaster in underground engineering, which seriously threatens the safety of personnel, equipment, and property. Utilizing machine learning models to evaluate rockburst risk is gradually becoming a trend. In this study, integrated algorithms under the Gradient Boosting Decision Tree (GBDT) framework were used to evaluate and classify rockburst intensity. First, a total of 301 rockburst data samples were obtained from a case database, and the data were preprocessed using the synthetic minority over-sampling technique (SMOTE). Then, rockburst evaluation models including GBDT, eXtreme Gradient Boosting (XGBoost), Light Gradient Boosting Machine (LightGBM), and Categorical Features Gradient Boosting (CatBoost) were established, and the optimal hyperparameters of the models were obtained through random grid search and five-fold cross-validation. Afterwards, the optimal hyperparameter configurations were used to fit the evaluation models, which were then analyzed on the test set. To evaluate performance, metrics including accuracy, precision, recall, and F1-score were selected for analysis and comparison with other machine learning models. Finally, the trained models were used to conduct rockburst risk assessment on rock samples from a mine in Shanxi Province, China, providing theoretical guidance for the mine's safe production. The models under the GBDT framework perform well in evaluating rockburst levels, and the proposed methods can provide a reliable reference for rockburst risk level analysis and safety management.
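SMOTE's core interpolation step can be sketched in a few lines of plain Python. This is a simplified stand-in for the library implementation presumably used in the study; the toy minority-class points and parameter values are hypothetical.

```python
import random

def smote_like(minority, k=2, n_new=4, seed=0):
    """Sketch of SMOTE's core idea: synthesize new minority-class samples
    by interpolating between a random minority sample and one of its
    k nearest minority-class neighbours."""
    rng = random.Random(seed)

    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: dist2(p, x))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic

# hypothetical 2-D minority-class samples
new_pts = smote_like([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)])
```

Each synthetic point lies on a segment between two real minority samples, which is how SMOTE balances the class distribution before model fitting.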
Funding: This work was supported by the National Natural Science Foundation of China (82273506, 82273508) and the Hunan Provincial Health Commission Scientific Research Plan Project (D202304128334), China.
Abstract: Objective: The causal relationship between eczema and autoimmune diseases has not been previously reported. This study aims to evaluate the causal relationship between eczema and autoimmune diseases. Methods: The two-sample Mendelian randomization (MR) method was used to assess the causal effect of eczema on autoimmune diseases. Summary data from genome-wide association studies (GWAS) were obtained from the Integrative Epidemiology Unit (IEU) database. For eczema and autoimmune diseases, genetic instrument variants (GIVs) were identified at the genome-wide significance threshold (P < 5×10^(-8)). Causal effect estimates were generated using the inverse-variance weighted (IVW) method. The MR-Egger, maximum likelihood, MR-PRESSO, and MR-RAPS methods were used for alternative analyses. Sensitivity tests, including heterogeneity, horizontal pleiotropy, and leave-one-out analyses, were performed. Finally, reverse causality was assessed. Results: Genetic susceptibility to eczema was associated with an increased risk of Crohn's disease (OR=1.444, 95% CI 1.199 to 1.738, P<0.001) and ulcerative colitis (OR=1.002, 95% CI 1.001 to 1.003, P=0.002). However, no causal relationship was found for the other six autoimmune diseases: systemic lupus erythematosus (SLE) (OR=0.932, P=0.401), bullous pemphigoid (BP) (OR=1.191, P=0.642), vitiligo (OR=1.000, P=0.327), multiple sclerosis (MS) (OR=1.000, P=0.965), ankylosing spondylitis (AS) (OR=1.001, P=0.121), and rheumatoid arthritis (RA) (OR=1.000, P=0.460). Additionally, no reverse causal relationship was found between autoimmune diseases and eczema. Conclusion: Eczema is associated with an increased risk of Crohn's disease and ulcerative colitis. No causal relationship is found between eczema and SLE, MS, AS, RA, BP, or vitiligo.
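The IVW estimator named above is a weighted average of per-variant Wald ratios (outcome effect divided by exposure effect), weighted by the inverse variance of each ratio. A minimal sketch, with hypothetical summary statistics for three genetic instruments:

```python
import math

def ivw_estimate(beta_exp, beta_out, se_out):
    """Inverse-variance weighted (IVW) MR estimate from per-variant
    summary statistics: Wald ratios beta_out/beta_exp combined with
    first-order weights beta_exp^2 / se_out^2."""
    ratios = [bo / be for be, bo in zip(beta_exp, beta_out)]
    weights = [(be / se) ** 2 for be, se in zip(beta_exp, se_out)]
    est = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the estimate
    return est, se

# hypothetical SNP-exposure betas, SNP-outcome betas, and outcome SEs
est, se = ivw_estimate([0.10, 0.20, 0.15],
                       [0.05, 0.11, 0.07],
                       [0.02, 0.03, 0.02])
```

An odds ratio would then be obtained as exp(est) when the outcome betas are on the log-odds scale.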
Funding: National Natural Science Foundation of China (Grant Nos. 12061028, 71871046) and the Support Program of the Guangxi Science Foundation, China (Grant No. 2018GXNSFAA281011).
Abstract: In this paper, we investigate the complete convergence and complete moment convergence for weighted sums of arrays of rowwise asymptotically negatively associated (ANA) random variables, without assuming identical distribution. The obtained results not only extend those of An and Yuan [1] and Shen et al. [2] to the case of ANA random variables, but also partially improve them.
Abstract: With the increasing complexity of production processes, there has been growing focus on online algorithms within the domain of multivariate statistical process control (SPC). Nonetheless, conventional methods, based on the assumption of complete data obtained at uniform time intervals, exhibit suboptimal performance in the presence of missing data. To maximize the available information, we propose an adaptive exponentially weighted moving average (EWMA) control chart employing a weighted imputation approach that leverages the relationships between complete and incomplete data. Specifically, we introduce two recovery methods: an improved K-Nearest Neighbors imputed value and the conventional univariate EWMA statistic. We then formulate an adaptive weighting function to combine these methods, assigning a smaller weight to the EWMA statistic when the sample information suggests an increased likelihood of the process being out of control, and vice versa. The robustness and sensitivity of the proposed scheme are demonstrated through simulation results and an illustrative example.
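The univariate EWMA statistic used as one of the two recovery methods can be sketched as follows; the smoothing constant, control-limit multiplier, and data below are illustrative choices, not the paper's settings.

```python
import math

def ewma_chart(xs, mu0=0.0, sigma=1.0, lam=0.2, L=2.7):
    """Univariate EWMA control chart: z_t = lam*x_t + (1-lam)*z_{t-1},
    flagged out-of-control when |z_t - mu0| exceeds the asymptotic
    limit L*sigma*sqrt(lam/(2-lam))."""
    limit = L * sigma * math.sqrt(lam / (2.0 - lam))
    z, signals = mu0, []
    for x in xs:
        z = lam * x + (1.0 - lam) * z
        signals.append(abs(z - mu0) > limit)
    return signals

# an in-control stretch followed by a sustained mean shift of 1.5*sigma
data = [0.0] * 10 + [1.5] * 10
flags = ewma_chart(data)
```

With these settings the limit is 0.9, and the statistic accumulates the shift until it signals a few observations after the change point.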
基金supported by the National Natural Science Foundation of China(61601147)the Beijing Natural Science Foundation(L182032)。
Abstract: In this paper, an efficient unequal error protection (UEP) scheme for online fountain codes is proposed. In the buildup phase, a traversing-selection strategy is proposed to select the most important symbols (MIS). Then, in the completion phase, a weighted-selection strategy is applied to provide low overhead. The performance of the proposed scheme is analyzed and compared with the existing UEP online fountain scheme. Simulation results show that when the bit error ratio is 10^(-4), the proposed scheme achieves overhead reductions of 85% and 31.58% for the MIS and the least important symbols (LIS), respectively.
Abstract: This article introduces a finite element procedure using the bilinear quadrilateral element or four-node rectangular element (namely, the Q4 element) based on a refined first-order shear deformation theory (rFSDT) and Monte Carlo simulation (MCS), a so-called refined stochastic finite element method, to investigate the random vibration of functionally graded material (FGM) plates subjected to a moving load. The advantage of the proposed method is that it uses rFSDT to improve the accuracy of classical FSDT and satisfy the stress-free condition at the plate boundaries, and combines with MCS to analyze the vibration of the FGM plate when the input parameters are random quantities following a normal distribution. The obtained results show that the distribution characteristics of the vibration response of the FGM plate depend on the standard deviation of the input parameters and the velocity of the moving load. Furthermore, the numerical results in this study are expected to contribute to improving the understanding of FGM plates subjected to moving loads with uncertain input parameters.
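The MCS half of such a procedure can be illustrated with a much simpler response function: draw normally distributed inputs, propagate each sample through the model, and summarize the output distribution. The midspan beam deflection formula below is a stand-in for the FGM plate model, and all parameter values are hypothetical.

```python
import random
import statistics

def mc_deflection(n=20000, seed=42):
    """Monte Carlo simulation sketch: propagate a normally distributed
    input (Young's modulus E) through a deterministic response function.
    Stand-in response: midspan deflection of a simply supported beam
    under a central load, w = F*L^3 / (48*E*I)."""
    rng = random.Random(seed)
    F, L, I = 10e3, 4.0, 8e-6      # load [N], span [m], second moment [m^4]
    samples = []
    for _ in range(n):
        E = rng.gauss(200e9, 10e9)  # mean 200 GPa, 5% coefficient of variation
        samples.append(F * L ** 3 / (48.0 * E * I))
    return statistics.mean(samples), statistics.stdev(samples)

mean_w, std_w = mc_deflection()
```

The output standard deviation scales with the input standard deviation, which is the qualitative behavior the abstract reports for the FGM plate response.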
基金the National Natural Science Foundation of China(Grant No.42274119)the Liaoning Revitalization Talents Program(Grant No.XLYC2002082)+1 种基金National Key Research and Development Plan Key Special Projects of Science and Technology Military Civil Integration(Grant No.2022YFF1400500)the Key Project of Science and Technology Commission of the Central Military Commission.
Abstract: This study focuses on improving path planning efficiency for underwater gravity-aided navigation. First, a Depth Sorting Fast Search (DSFS) algorithm was proposed to improve the planning speed of the Quick Rapidly-exploring Random Trees* (Q-RRT*) algorithm. A cost inequality relationship between an ancestor and its descendants was derived, and the ancestors were filtered accordingly. Second, an underwater gravity-aided navigation path planning system was designed based on the DSFS algorithm, taking into account the fitness, safety, and asymptotic optimality of the routes according to the gravity suitability distribution of the navigation space. Finally, experimental comparisons of the computing performance of the ChooseParent procedure, the Rewire procedure, and the combination of the two procedures for Q-RRT* and DSFS were conducted under the same planning environment and parameter conditions. The results showed that the computational efficiency of the DSFS algorithm was improved by about 1.2 times compared with the Q-RRT* algorithm while ensuring correct computational results.
Funding: This work was supported by the National Natural Science Foundation of China (12171451, 72091212).
Abstract: Mendelian randomization (MR) is widely used in causal mediation analysis to control for unmeasured confounding effects, and it is valid under some strong assumptions. It is thus of great interest to assess the impact of violations of these MR assumptions through sensitivity analysis. Sensitivity analyses have been conducted for simple MR-based average causal effect analyses, but they are not available for MR-based mediation analysis studies; we aim to fill this gap in this paper. We propose to use two sensitivity parameters to quantify the effect of deviations from the IV assumptions. With these two sensitivity parameters, we derive consistent indirect causal effect estimators and establish their asymptotic properties. Our theoretical results can be used in MR-based mediation analysis to study the impact of violations of MR assumptions. The finite sample performance of the proposed method is illustrated through simulation studies, sensitivity analysis, and application to a real genome-wide association study.
基金supported by the Anhui Postdoctoral Scientific Research Program Foundation(2022B579).
Abstract: Many digital platforms have employed free-content promotion strategies to deal with the high levels of uncertainty surrounding digital content products. However, the diversity of digital content products and user heterogeneity in content preference may blur the impact of platform promotions across users and products. Therefore, free-content promotion strategies should be adapted to allocate marketing resources optimally and increase revenue. This study develops personalized free-content promotion strategies based on individual-level heterogeneous treatment effects and explores the causes of their heterogeneity, focusing on the moderating effect of user engagement-related variables. To this end, we utilize random field experimental data provided by a top Chinese e-book platform. We employ a framework that combines machine learning with econometric causal inference methods to estimate individual treatment effects and analyze their potential mechanisms. The analysis shows that, on average, free-content promotions lead to a significant increase in consumer payments. However, the higher the level of user engagement, the lower the payment lift caused by promotions, as more-engaged users are more strongly affected by the cannibalization effect of free content. This study introduces a novel causal research design to help platforms improve their marketing strategies.
基金supported by the National Natural Science Foundation of China(Grant No.11861029)the Hainan Provincial Natural Science Foundation of China(Grants Nos.122MS056,124MS056).
Abstract: Let X = Σ_(i=1)^n a_i ξ_i be a Rademacher sum with Var(X) = 1 and let Z be a standard normal random variable. This paper concerns the upper bound of |P(X ≤ x) − P(Z ≤ x)| for any x ∈ R. Using symmetry properties and the R software, this paper obtains the following improved Berry-Esseen type bound under some conditions: |P(X ≤ x) − P(Z ≤ x)| ≤ P(Z ∈ (0, a_1)), ∀x ∈ R, which proves one of the modified conjectures proposed by Nathan K. and Ohad K.
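The stated bound can be checked numerically for a small equal-weight case, where the distribution of X can be enumerated exactly:

```python
import itertools
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def max_gap(a):
    """Exhaustively compute sup_x |P(X <= x) - Phi(x)| for the Rademacher
    sum X = sum a_i xi_i, checking the CDF and its left limit at every
    atom of X (the supremum is attained at one of these points)."""
    sums = sorted(sum(e * ai for e, ai in zip(eps, a))
                  for eps in itertools.product((-1, 1), repeat=len(a)))
    n = len(sums)
    gap = 0.0
    for i, s in enumerate(sums):
        cdf_at = (i + 1) / n   # P(X <= s) when s is the last copy of an atom
        cdf_below = i / n      # P(X < s) when s is the first copy
        gap = max(gap, abs(cdf_at - phi(s)), abs(cdf_below - phi(s)))
    return gap

# equal weights a_i = 1/2, so Var(X) = 4 * (1/2)^2 = 1
a = [0.5, 0.5, 0.5, 0.5]
g = max_gap(a)
bound = phi(a[0]) - phi(0.0)  # P(Z in (0, a_1))
```

For a_i = 1/2 (i = 1, ..., 4), the exhaustive maximum gap is 3/16 = 0.1875, which indeed does not exceed P(Z ∈ (0, 1/2)) ≈ 0.1915.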
Abstract: As the "engine" of continuous and repeated equipment operation, equipment maintenance support plays an increasingly prominent role in the confrontation between symmetrical combat systems. As the basis and guide for planning and implementing equipment maintenance tasks, equipment damage measurement is an important guarantee for the effective implementation of maintenance support. First, starting from the basic problem of wartime equipment damage measurement, this article comprehensively analyzes the factors influencing damage measurement across the enemy's attributes, our own attributes, and the battlefield environment. Second, it determines the key factors based on fuzzy comprehensive evaluation (FCE) and performs principal component analysis (PCA) on them. Finally, the principal components representing more than 85% of the data features are taken as the input and the equipment damage quantity as the output; the data are trained and tested with an artificial neural network (ANN) and random forest (RF). In summary, FCE-PCA-RF can serve as a reference for research on wartime equipment damage estimation.
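The FCE step can be sketched with the standard weighted-average operator: a factor weight vector is combined with a factor-by-grade membership matrix, and the grade with the highest combined membership is selected. The factors, grades, weights, and membership values below are hypothetical, not the paper's data.

```python
def fuzzy_comprehensive_evaluation(weights, membership):
    """Sketch of fuzzy comprehensive evaluation (FCE): combine a factor
    weight vector A with a factor-by-grade membership matrix R via the
    weighted-average operator B = A . R, then pick the grade with the
    highest combined membership."""
    n_grades = len(membership[0])
    b = [sum(w * row[j] for w, row in zip(weights, membership))
         for j in range(n_grades)]
    return b, max(range(n_grades), key=lambda j: b[j])

# hypothetical example: three influence factors, three damage grades
weights = [0.5, 0.3, 0.2]          # factor importance, summing to 1
membership = [[0.1, 0.3, 0.6],     # factor 1 memberships per grade
              [0.2, 0.5, 0.3],     # factor 2
              [0.4, 0.4, 0.2]]     # factor 3
b, grade = fuzzy_comprehensive_evaluation(weights, membership)
```

Since the weights and each membership row sum to one, the combined membership vector also sums to one, and the maximum entry identifies the dominant grade.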