Funding: Supported by the National Natural Science Foundation of China (12061017, 12361055) and the Research Fund of Guangxi Key Lab of Multi-source Information Mining & Security (22-A-01-01).
Abstract: The existing blockwise empirical likelihood (BEL) method blocks the observations or their analogues, which has proven useful in several dependent-data settings. In this paper, we introduce a new BEL (NBEL) method that instead blocks the scoring functions in high-dimensional cases. Using the NBEL method, we study the construction of confidence regions for the parameters of spatial autoregressive models with spatial autoregressive disturbances (SARAR models) when the parameter dimension is high. It is shown that the NBEL ratio statistics are asymptotically χ²-type distributed, which yields NBEL-based confidence regions for the parameters of SARAR models. A simulation study compares the performance of the NBEL and the usual EL methods.
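As background for readers unfamiliar with empirical likelihood, the sketch below computes the classical (non-blockwise) EL log-ratio statistic for a scalar mean via Owen's dual problem. It illustrates only the plain EL baseline that the paper compares against, not the NBEL construction for SARAR models; the data and function names are hypothetical.

```python
import numpy as np

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for a scalar mean mu (Owen's EL).

    Solves the dual equation sum_i g_i / (1 + lam*g_i) = 0 by bisection,
    with g_i = x_i - mu, then returns 2 * sum_i log(1 + lam*g_i).
    Under the true mean this statistic is asymptotically chi-squared(1).
    """
    g = np.asarray(x, float) - mu
    if g.min() >= 0 or g.max() <= 0:            # mu outside the convex hull
        return np.inf
    n = len(g)
    # Feasible lam keeps every weight w_i = 1/(n*(1 + lam*g_i)) in (0, 1).
    lam_lo = (1.0 / n - 1.0) / g.max() + 1e-12
    lam_hi = (1.0 / n - 1.0) / g.min() - 1e-12
    for _ in range(200):                         # dual function is monotone in lam
        lam = 0.5 * (lam_lo + lam_hi)
        if np.sum(g / (1.0 + lam * g)) > 0:
            lam_lo = lam
        else:
            lam_hi = lam
    return 2.0 * np.sum(np.log1p(lam * g))

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, size=200)
r_at_mean = el_log_ratio(x, x.mean())            # ~0: ratio is maximized at the MLE
r_shifted = el_log_ratio(x, x.mean() + 0.3)      # grows as mu moves away
```

An EL confidence region is then the set of mu values where the statistic stays below the appropriate χ² quantile; the NBEL method of the paper replaces the individual scoring terms with blocks of them before this step.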
Abstract: Risk management often plays an important role in decision making under uncertainty. In quantitative risk management, assessing and optimizing risk metrics requires efficient computing techniques and reliable theoretical guarantees. In this paper, we introduce several topics in quantitative risk management and review recent studies and advances on them. We consider several risk metrics and study decision models that involve these metrics, with a main focus on the related computing techniques and theoretical properties. We show that stochastic optimization, as a powerful tool, can be leveraged to address these problems effectively.
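As a concrete instance of the optimization viewpoint on risk metrics, the sketch below estimates Conditional Value-at-Risk by minimizing the Rockafellar–Uryasev objective over a grid of thresholds. The sample and the grid are illustrative choices, not taken from the paper.

```python
import numpy as np

def cvar_ru(losses, alpha, t_grid):
    """CVaR_alpha via the Rockafellar-Uryasev representation:
    CVaR_alpha(L) = min_t  t + E[(L - t)_+] / (1 - alpha).
    The minimum over t of this piecewise-linear objective equals CVaR."""
    losses = np.asarray(losses, float)
    vals = [t + np.mean(np.maximum(losses - t, 0.0)) / (1.0 - alpha)
            for t in t_grid]
    return float(np.min(vals))

rng = np.random.default_rng(0)
sample = rng.normal(size=100_000)                 # standard normal "loss"
est = cvar_ru(sample, alpha=0.95, t_grid=np.linspace(1.0, 3.0, 2001))
# For N(0,1), the analytic value is pdf(1.645)/0.05, roughly 2.06
```

The same representation turns CVaR-constrained or CVaR-minimizing decision problems into ordinary stochastic programs, which is one sense in which stochastic optimization addresses these risk metrics.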
Abstract: Crocuta and Pachycrocuta are widely regarded as the most prevalent and emblematic hyenas across Eurasia during the Quaternary. They are easily distinguished by their distinctive carnassial teeth. However, the disparities in non-carnassial elements are less pronounced and have received minimal attention in previous studies. This has resulted in erroneous identifications of fragmentary specimens, particularly where carnassial teeth are poorly preserved or absent. Such misidentifications can in turn lead to erroneous inferences about the paleozoogeography and biochronology of the animals in question. The bone-cracking hyena specimens from Huainan, Anhui (Tseng et al., 2008) are re-examined and re-evaluated here through a series of morphological comparisons and data analyses (univariate, bivariate, and multivariate analyses, etc.). The results confirm unequivocally that the specimens from Xiliexi are not spotted hyenas but belong to Pachycrocuta perrieri. Conversely, the specimen from Dadingshan is the only genuine representative of the spotted hyena and may constitute the earliest fossil record of Crocuta ultima in China to date. Furthermore, the disparities in dentognathic morphology between Crocuta and Pachycrocuta are systematically summarized and analyzed, with an explanation of their eco-functional significance. The present study hypothesizes that Pachycrocuta retained a certain degree of active predatory capability, attributable to the robustness of the canine and the mandibular symphysis, among other factors. This indicates that Pachycrocuta exhibited flexible foraging behavior, combining opportunistic scavenging and active hunting in a manner analogous to Crocuta. Finally, the dynamic evolutionary history of hyenas in East China since the Pleistocene is broadly reconstructed.
Funding: Supported by the Natural Science Foundation of Xinjiang Uygur Autonomous Region (Grant No. 2022D01B187).
Abstract: Federated learning (FL) is a distributed machine learning paradigm for edge cloud computing. FL can facilitate data-driven decision-making in tactical scenarios, effectively addressing both the data volume and infrastructure challenges of edge environments. However, the diversity of clients in edge cloud computing presents significant challenges for FL. Personalized federated learning (pFL) has therefore received considerable attention in recent years; one line of pFL work exploits both global and local information in the local model. Current pFL algorithms suffer from limitations such as slow convergence, catastrophic forgetting, and poor performance on complex tasks, and they still fall significantly short of centralized learning. To achieve high pFL performance, we propose FedCLCC: Federated Contrastive Learning and Conditional Computing. The core of FedCLCC is the combined use of contrastive learning and conditional computing: contrastive learning measures feature-representation similarity to adjust the local model, while conditional computing separates global and local information and feeds each to its corresponding head for global and local handling. Our comprehensive experiments demonstrate that FedCLCC outperforms other state-of-the-art FL algorithms.
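The abstract does not spell out FedCLCC's loss. As a generic illustration of how contrastive learning can compare feature representations in personalized FL (in the spirit of model-contrastive methods such as MOON, not FedCLCC's actual formulation), a local representation can be pulled toward the global model's representation and pushed away from the stale previous local one:

```python
import numpy as np

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """Generic model-contrastive regularizer: treat the global model's
    feature z_global as the positive and the previous local model's
    feature z_prev as the negative for the current local feature z_local."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    pos = np.exp(cos(z_local, z_global) / tau)
    neg = np.exp(cos(z_local, z_prev) / tau)
    return -np.log(pos / (pos + neg))

z = np.array([1.0, 0.0])
loss_aligned = model_contrastive_loss(z, z, -z)     # local agrees with global
loss_misaligned = model_contrastive_loss(z, -z, z)  # local agrees with stale model
```

A small loss when the local feature tracks the global one (and a large loss otherwise) is what lets such a term steer local training while preserving personalization.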
Funding: Funded through the India Meteorological Department, New Delhi, India, under the Forecasting Agricultural output using Space, Agrometeorology and Land based observations (FASAL) project (fund number: No. ASC/FASAL/KT-11/01/HQ-2010).
Abstract: Background Cotton is one of the most important commercial crops after food crops, especially in countries like India, where it is grown extensively under rainfed conditions. Because of its use in multiple industries, such as the textile, medicine, and automobile industries, it has great commercial importance. The crop's performance is strongly influenced by prevailing weather dynamics, so as the climate changes, assessing how weather changes affect crop performance is essential. Among the various available techniques, crop models are the most effective and widely used tools for predicting yields. Results This study compares statistical and machine learning models in their ability to predict cotton yield across the major producing districts of Karnataka, India, using a long-term dataset (1990-2023) of yield and weather factors. Artificial neural networks (ANNs) performed best, with acceptable yield deviations within ±10% during both the vegetative stage (F1) and mid stage (F2) of cotton. The model evaluation metrics, root mean square error (RMSE), normalized root mean square error (nRMSE), and modelling efficiency (EF), were also within the acceptance limits in most districts. Furthermore, the tested ANN model was used to assess the importance of the dominant weather factors influencing crop yield in each district. In particular, morning relative humidity as an individual parameter, and its interaction with maximum and minimum temperature, had a major influence on cotton yield in most of the districts for which yield was predicted. These differences highlight the district-specific interactions of weather factors in cotton yield formation, reflecting the individual response to each weather factor under the different soils and management conditions of the major cotton-growing districts of Karnataka. Conclusions Compared with statistical models, machine learning models such as ANNs proved more efficient in forecasting cotton yield owing to their ability to capture the interactive effects of weather factors on yield formation at different growth stages. This highlights the suitability of ANNs for yield forecasting under rainfed conditions and for studying the relative impacts of weather factors on yield. The study thus provides valuable insights to support stakeholders in planning effective crop management strategies and formulating relevant policies.
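The evaluation metrics named in the abstract can be written out explicitly. The sketch below uses made-up yield numbers, not the study's data, and takes EF to be the Nash–Sutcliffe modelling efficiency, one common definition (the study's exact formula may differ):

```python
import numpy as np

def evaluation_metrics(observed, predicted):
    """RMSE, normalized RMSE (as % of the observed mean), and modelling
    efficiency EF (Nash-Sutcliffe form); EF = 1 indicates a perfect fit,
    EF <= 0 means the model is no better than the observed mean."""
    obs = np.asarray(observed, float)
    pred = np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((obs - pred) ** 2))
    nrmse = 100.0 * rmse / obs.mean()
    ef = 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, nrmse, ef

obs = np.array([500.0, 620.0, 410.0, 580.0])   # hypothetical district yields, kg/ha
pred = np.array([520.0, 600.0, 430.0, 560.0])
rmse, nrmse, ef = evaluation_metrics(obs, pred)
```

nRMSE expressed as a percentage of the mean is what makes the "acceptance limits" comparable across districts with very different yield levels.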
Funding: Supported by the National Natural Science Foundation of China (12072090).
Abstract: This paper addresses the hypersonic glide vehicle (HGV) tracking problem under high maneuverability and non-stationary heavy-tailed measurement noise without prior statistics in complicated flight environments. Since interacting multiple model (IMM) filtering is well known for its ability to cover the movement properties of motion models, the problem is formulated as modeling the non-stationary heavy-tailed measurement noise, without any prior statistics, within the IMM framework. First, the Gaussian-inverse Wishart distribution is embedded in an improved Pearson type-VII (PTV) distribution, which can adaptively adjust its parameters to model the non-stationary heavy-tailed measurement noise. In addition, the degree-of-freedom (DOF) parameters are obtained by maximizing the evidence lower bound (ELBO) in a variational Bayesian optimization framework, rather than being fixed, to handle uncertain non-Gaussian degrees. The paper then analytically derives fusion forms based on the maximum Versoria fusion criterion instead of the moment-matching approach, which, combined with weighted Kullback-Leibler average theory, provides a precise approximation of the PTV mixture distribution in the mixing and output steps. Simulation results demonstrate the superiority and robustness of the proposed algorithm in typical HGV tracking scenarios when the measurement noise is non-stationary and its statistics are unknown a priori.
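For reference, one common parameterization of the multivariate Pearson type-VII density, the heavy-tailed family the abstract builds on (the paper's "improved" variant may parameterize it differently), is

$$
p(\mathbf{x}\mid\boldsymbol{\mu},\boldsymbol{\Sigma},\lambda,\nu)
= \frac{\Gamma(\lambda)}{\Gamma\!\left(\lambda-\tfrac{d}{2}\right)(\pi\nu)^{d/2}\,|\boldsymbol{\Sigma}|^{1/2}}
\left[1+\frac{(\mathbf{x}-\boldsymbol{\mu})^{\mathsf T}\boldsymbol{\Sigma}^{-1}(\mathbf{x}-\boldsymbol{\mu})}{\nu}\right]^{-\lambda},
$$

where $d$ is the dimension. Choosing $\lambda=(\nu+d)/2$ recovers the multivariate Student-t with $\nu$ degrees of freedom, whose polynomial tails are what make the family suitable for heavy-tailed measurement noise; treating $\lambda$ and $\nu$ as free (as with the DOF parameters above) gives the extra flexibility the adaptive scheme exploits.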
Funding: Project (52274096) supported by the National Natural Science Foundation of China; Project (WS2023A03) supported by the State Key Laboratory Cultivation Base for Gas Geology and Gas Control, China.
Abstract: Accurate assessment of coal brittleness is crucial in the design of coal seam drilling and underground coal mining operations. This study proposes a method for evaluating the brittleness of gas-bearing coal based on a statistical damage constitutive model and energy evolution mechanisms. First, integrating the principle of effective stress and the Hoek-Brown criterion, a statistical damage constitutive model for gas-bearing coal is established and validated against triaxial compression tests under different gas pressures to verify its accuracy and applicability. Next, employing energy evolution mechanisms, two energy characteristic parameters (elastic energy proportion and dissipated energy proportion) are analyzed, and, based on the damage stress thresholds, the damage evolution characteristics of gas-bearing coal are explored. Finally, by integrating the energy characteristic parameters with the damage parameters, a novel brittleness index is proposed. The results demonstrate that the theoretical curves derived from the statistical damage constitutive model closely match the test curves, accurately reflecting the stress-strain characteristics of gas-bearing coal and capturing the stress drop and softening of the coal in the post-peak stage. The shape parameter and scale parameter represent the brittleness and macroscopic strength of the coal, respectively. As gas pressure increases from 1 to 5 MPa, the shape parameter and the scale parameter decrease by 22.18% and 60.45%, respectively, indicating a reduction in both the brittleness and the strength of the coal. Parameters such as the maximum damage rate and the peak elastic energy storage limit correlate positively with coal brittleness. The brittleness index effectively captures the brittleness characteristics and reveals a decrease in brittleness and an increase in sensitivity to plastic deformation under higher gas pressure conditions.
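Statistical damage constitutive models of this kind typically assign the micro-element strength a Weibull distribution, which is why a shape parameter governs brittleness and a scale parameter governs macroscopic strength. The sketch below is a simplified strain-based illustration of that mechanism only; the paper's actual model additionally incorporates effective stress (gas pressure) and the Hoek-Brown criterion, and all numbers here are hypothetical.

```python
import numpy as np

def stress_strain(eps, E=3.0e3, m=2.0, F0=0.01):
    """Illustrative Weibull-type statistical damage law:
    damage D(eps) = 1 - exp(-(eps/F0)^m), stress sigma = E*eps*(1 - D).
    m (shape) controls brittleness; F0 (scale) controls strength."""
    D = 1.0 - np.exp(-(eps / F0) ** m)
    return E * eps * (1.0 - D)

eps = np.linspace(1e-6, 0.05, 2000)
peak_brittle = stress_strain(eps, m=4.0).max()   # larger shape: sharper post-peak drop
peak_ductile = stress_strain(eps, m=1.5).max()   # smaller shape: gentler softening
```

In this toy form, a larger shape parameter concentrates micro-element failure near the peak, producing the abrupt stress drop associated with brittleness, consistent with the abstract's observation that a decreasing shape parameter signals reduced brittleness.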
Abstract: With the use of inner-core data in tropical cyclone forecasting, the importance of such data has drawn increasing attention. To study the operational impact of the Tail Doppler Radar (TDR) component of these data, this paper assimilates TDR data collected during the 2012 hurricane-strength tropical cyclone Isaac using the operational Hurricane Weather Research and Forecasting (HWRF) numerical model together with the three-dimensional variational (3DVar) data assimilation capability of the operational Grid-point Statistical Interpolation (GSI) system, carries out a series of forecast experiments, and analyzes the results. The results show that, compared with the operational HWRF forecasts, assimilating TDR data with the GSI system clearly improves the track and intensity forecasts of the tropical cyclone; at the same time, the assimilation results indicate that the static background error covariance used in the operational 3DVar still needs further improvement for TDR data applications.
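For context, the 3DVar analysis in GSI minimizes the standard variational cost function (generic form, not specific to this experiment's configuration):

$$
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
+ \tfrac{1}{2}\,\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathsf T}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big),
$$

where $\mathbf{x}_b$ is the background state, $\mathbf{y}$ the observations (here including TDR radial winds), $H$ the observation operator, $\mathbf{R}$ the observation error covariance, and $\mathbf{B}$ the static background error covariance. It is this static $\mathbf{B}$ term that the abstract's experiments suggest still needs improvement for TDR data applications.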
Abstract: W. E. Deming (1940), discussion of Professor Hotelling's paper "The Teaching of Statistics" (Ann. Math. Stat., Vol. 11, 457-470): "Above all, a statistician must be a scientist."
Abstract: Achieving Six-Sigma process capability starts with listening to the Voice of the Customer, and it becomes a reality by combining the People Power and the Process Power of the organisation. This paper presents a Six-Sigma implementation case study carried out in a magnet manufacturing company, which produces bearing magnets for use in energy meters. Bearing magnets are accepted by customers if their thickness lies between 2.35 mm and 2.50 mm. The company could not consistently produce bearing magnets within this thickness range, as its process distribution was flat, with 2.20 mm as the lower control limit and 2.60 mm as the upper control limit. This resulted in heavy losses in the form of non-conformities, lost time, and lost goodwill. The company's process capability at the time was around 0.40. Organisational restructuring was carried out to reap the benefit of the People Power of the organisation. Statistically designed experiments (Taguchi-method-based Design of Experiments) and online quality control tools (Statistical Process Control tools) were used effectively to complete the DMAIC (Define, Measure, Analyse, Improve, and Control) cycle and reap the benefit of the Process Power of the organisation. The company now enjoys a process capability of 1.75, well on the way towards Six-Sigma process capability.
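The capability figures quoted above follow from the standard potential-capability formula Cp = (USL − LSL)/(6σ). The sketch below backs out the process spread implied by the reported Cp values under the usual normality and centering assumptions; it is an illustration, not the company's measured data.

```python
def process_capability(usl, lsl, sigma):
    """Potential process capability: Cp = (USL - LSL) / (6 * sigma)."""
    return (usl - lsl) / (6.0 * sigma)

# Spec limits from the case study: 2.35 mm to 2.50 mm.
# Back out the process sigma implied by the reported Cp values.
sigma_before = (2.50 - 2.35) / (6.0 * 0.40)   # Cp ~ 0.40 before the project
sigma_after = (2.50 - 2.35) / (6.0 * 1.75)    # Cp ~ 1.75 afterwards
cp_before = process_capability(2.50, 2.35, sigma_before)
cp_after = process_capability(2.50, 2.35, sigma_after)
```

The jump from Cp ≈ 0.40 to 1.75 corresponds to shrinking the process standard deviation by a factor of more than four, which is what the DMAIC improvements had to achieve.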