Background: Cotton is one of the most important commercial crops after food crops, especially in countries like India, where it is grown extensively under rainfed conditions. Because of its use in multiple industries, such as the textile, medicine, and automobile industries, it has great commercial importance. The crop's performance is strongly influenced by prevailing weather dynamics. As the climate changes, assessing how weather changes affect crop performance is essential. Among the various techniques available, crop models are the most effective and widely used tools for predicting yields.

Results: This study compares statistical and machine learning models to assess their ability to predict cotton yield across the major producing districts of Karnataka, India, using a long-term dataset spanning 1990 to 2023 that includes yield and weather factors. Artificial neural networks (ANNs) performed best, with acceptable yield deviations within ±10% during both the vegetative stage (F1) and mid stage (F2) of cotton. Model evaluation metrics such as root mean square error (RMSE), normalized root mean square error (nRMSE), and modelling efficiency (EF) were also within acceptable limits in most districts. Furthermore, the tested ANN model was used to assess the importance of the dominant weather factors influencing crop yield in each district. Specifically, morning relative humidity as an individual parameter, and its interaction with maximum and minimum temperature, had a major influence on cotton yield in most of the districts for which yield was predicted. These differences highlight the differential interactions of weather factors in cotton yield formation in each district, reflecting the individual response of each weather factor under the different soils and management conditions of the major cotton-growing districts of Karnataka.

Conclusions: Compared with statistical models, machine learning models such as ANNs proved more efficient in forecasting cotton yield because of their ability to consider the interactive effects of weather factors on yield formation at different growth stages. This highlights the suitability of ANNs for yield forecasting under rainfed conditions and for studying the relative impacts of weather factors on yield. The study thus provides valuable insights to support stakeholders in planning effective crop management strategies and formulating relevant policies.
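The evaluation metrics named in the abstract (RMSE, nRMSE, EF) can be sketched in a few lines; the yield values below are invented for illustration and are not the study's data:

```python
import math

def rmse(obs, pred):
    """Root mean square error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def nrmse(obs, pred):
    """Normalized RMSE, expressed as a percentage of the observed mean."""
    return 100.0 * rmse(obs, pred) / (sum(obs) / len(obs))

def modelling_efficiency(obs, pred):
    """Nash-Sutcliffe modelling efficiency (EF); 1.0 means a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

obs = [420.0, 455.0, 390.0, 510.0, 475.0]   # hypothetical yields, kg/ha
pred = [401.0, 470.0, 402.0, 495.0, 468.0]  # hypothetical ANN predictions
print(rmse(obs, pred), nrmse(obs, pred), modelling_efficiency(obs, pred))
```

An nRMSE below 10% and an EF close to 1 are the usual "acceptance limits" referred to in such studies.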
Accurate assessment of coal brittleness is crucial in the design of coal seam drilling and underground coal mining operations. This study proposes a method for evaluating the brittleness of gas-bearing coal based on a statistical damage constitutive model and energy evolution mechanisms. First, integrating the principle of effective stress and the Hoek-Brown criterion, a statistical damage constitutive model for gas-bearing coal is established and validated through triaxial compression tests under different gas pressures to verify its accuracy and applicability. Subsequently, employing the energy evolution mechanism, two energy characteristic parameters (elastic energy proportion and dissipated energy proportion) are analyzed. Based on the damage stress thresholds, the damage evolution characteristics of gas-bearing coal are explored. Finally, by integrating the energy characteristic parameters with the damage parameters, a novel brittleness index is proposed. The results demonstrate that the theoretical curves derived from the statistical damage constitutive model closely align with the test curves, accurately reflecting the stress-strain characteristics of gas-bearing coal and revealing the stress drop and softening characteristics of coal in the post-peak stage. The shape parameter and scale parameter represent the brittleness and macroscopic strength of the coal, respectively. As gas pressure increases from 1 to 5 MPa, the shape parameter and the scale parameter decrease by 22.18% and 60.45%, respectively, indicating a reduction in both the brittleness and the strength of the coal. Parameters such as the maximum damage rate and the peak elastic energy storage limit correlate positively with coal brittleness. The proposed brittleness index effectively captures the brittleness characteristics and reveals a decrease in brittleness and an increase in sensitivity to plastic deformation under higher gas pressures.
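The two energy characteristic parameters can be illustrated with a minimal numeric sketch. The loading path, modulus, and softening factor below are hypothetical placeholders, not the paper's test data:

```python
import numpy as np

def energy_proportions(strain, stress, E):
    """Split the total strain energy density into elastic and dissipated parts:
    U_total = integral of sigma d(epsilon) (trapezoidal rule),
    U_e = sigma^2 / (2E) at the current stress, U_d = U_total - U_e."""
    u_total = float(np.sum(0.5 * (stress[1:] + stress[:-1]) * np.diff(strain)))
    u_elastic = stress[-1] ** 2 / (2.0 * E)
    u_dissipated = u_total - u_elastic
    return u_elastic / u_total, u_dissipated / u_total

# hypothetical pre-peak loading path with mild yielding
E = 3.0e9                                    # Pa, assumed modulus for coal
strain = np.linspace(0.0, 0.01, 1001)
stress = E * strain * (1.0 - 20.0 * strain)  # illustrative softening term
p_elastic, p_dissipated = energy_proportions(strain, stress, E)
print(p_elastic, p_dissipated)
```

A growing dissipated-energy proportion along the loading path is exactly the kind of signal the paper's brittleness index combines with the damage parameters.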
Dynamic tensile impact properties of aramid (Technora) and UHMWPE (DC851) fiber bundles were studied at two high strain rates by means of a reflecting-type split Hopkinson bar, and stress-strain curves of the fiber yarns at different strain rates were obtained. Experimental results show that the initial elastic modulus, failure strength, and unstable strain of aramid fiber yarns are strain-rate insensitive, whereas the initial elastic modulus and unstable strain of UHMWPE fiber yarns are strain-rate sensitive. A fiber-bundle statistical constitutive equation was used to describe the tensile behavior of aramid and UHMWPE fiber bundles at high strain rates. The good consistency between the simulated results and the experimental data indicates that the modified double Weibull function can represent the tensile strength distribution of aramid and UHMWPE fibers, and that the method of extracting Weibull parameters from fiber-bundle stress-strain data is valid.
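A fiber-bundle statistical constitutive equation of this kind can be sketched as follows. This is a generic two-component (double) Weibull mixture with invented parameters, not the paper's fitted modified double Weibull:

```python
import math

def bundle_stress(eps, E, w1, w2, q):
    """Fiber-bundle stress at strain eps: intact fibers carry E*eps, and the
    surviving fraction follows a two-component Weibull mixture.
    w1 and w2 are (scale, shape) pairs; q weights the first component."""
    (e1, m1), (e2, m2) = w1, w2
    survival = q * math.exp(-((eps / e1) ** m1)) \
        + (1.0 - q) * math.exp(-((eps / e2) ** m2))
    return E * eps * survival

# invented parameters loosely shaped like a high-modulus organic fiber (GPa)
E, w1, w2, q = 120.0, (0.025, 6.0), (0.035, 10.0), 0.4
curve = [bundle_stress(0.001 * i, E, w1, w2, q) for i in range(61)]
peak = max(curve)
print(peak, curve.index(peak))  # peak stress and the strain index at the peak
```

The bundle curve rises while fibers survive and drops as breakage accumulates, which is the qualitative shape fitted against the Hopkinson-bar data.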
Multivariate statistical techniques, such as cluster analysis (CA), discriminant analysis (DA), principal component analysis (PCA), and factor analysis (FA), were applied to evaluate and interpret the surface water quality data sets of the Second Songhua River (SSHR) basin in China, obtained during two years (2012-2013) of monitoring of 10 physicochemical parameters at 15 different sites. The results showed that most physicochemical parameters varied significantly among the sampling sites. Three significant groups of sampling sites, highly polluted (HP), moderately polluted (MP), and less polluted (LP), were obtained through hierarchical agglomerative CA on the basis of similarity of water quality characteristics. DA identified pH, F, DO, NH3-N, COD, and VPhs as the most important parameters contributing to spatial variations in surface water quality. However, DA did not give a considerable data reduction (40% reduction). PCA/FA resulted in three, three, and four latent factors explaining 70%, 62%, and 71% of the total variance in the water quality data sets of the HP, MP, and LP regions, respectively. FA revealed that SSHR water chemistry was strongly affected by anthropogenic activities (point sources: industrial effluents and wastewater treatment plants; non-point sources: domestic sewage, livestock operations, and agricultural activities) and natural processes (seasonal effects and natural inputs). PCA/FA over the whole basin showed the best results for data reduction, because it used only two parameters (about 80% reduction) as the most important parameters to explain 72% of the data variation. Thus, this work illustrates the utility of multivariate statistical techniques for the analysis and interpretation of data sets and, in water quality assessment, for the identification of pollution sources/factors and the understanding of spatial variations in water quality for effective stream water quality management.
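The PCA/FA step of extracting latent factors and their explained variance can be sketched with synthetic data; the 15×10 matrix below merely stands in for the monitored sites and parameters and carries one artificial shared "pollution" factor:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic 15 sites x 10 physicochemical parameters with one shared factor
X = rng.standard_normal((15, 10)) + np.outer(rng.standard_normal(15), np.ones(10))
corr = np.corrcoef(X, rowvar=False)            # 10 x 10 correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = eigvals / eigvals.sum()            # variance explained per factor
print(np.round(np.cumsum(explained)[:3], 3))   # first three latent factors
```

Factors whose cumulative share reaches roughly 60-70% of the variance would be retained, mirroring the three-to-four-factor solutions reported for the HP, MP, and LP regions.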
Instantaneous frequency (IF) estimation of linear frequency modulated (LFM) signals with time-varying amplitude using the peak of the Wigner-Ville distribution (WVD) is studied. Theoretical analysis shows that the estimation for LFM signals with time-varying amplitude is unbiased only if the WVD of the time-varying amplitude reaches its maximum at zero frequency at every time instant. The statistical performance in the case of additive white Gaussian noise is evaluated, and an analytical expression for the variance is provided. Simulations using LFM signals with Gaussian envelopes verify that the IF can be estimated accurately using the peak of the WVD for four models of amplitude variation. Furthermore, the statistical estimation results for signals whose amplitude descends before rising are better than those for signals with constant amplitude when the amplitude variation rate is moderate.
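The peak-of-WVD estimator can be sketched for a constant-amplitude LFM signal. The sampling rate, chirp parameters, and lag-window length below are arbitrary illustrative choices; note that this discrete form has its frequency axis compressed by a factor of two, so the chirp is kept well below fs/4:

```python
import numpy as np

def wvd_if_estimate(x, n, M, fs):
    """Estimate the instantaneous frequency at sample n from the peak of a
    discrete Wigner-Ville distribution: kernel r[m] = x[n+m]*conj(x[n-m]),
    FFT over the lag m, peak bin mapped back to Hz (note the factor 2)."""
    m = np.arange(-M, M + 1)
    kernel = x[n + m] * np.conj(x[n - m])
    N = 8 * M                                 # zero-padding refines the grid
    p = int(np.argmax(np.abs(np.fft.fft(kernel, N))[: N // 2]))
    return p * fs / (2.0 * N)

fs, f0, k = 1000.0, 100.0, 400.0              # Hz, Hz, Hz/s: IF(t) = f0 + k*t
t = np.arange(0, 0.5, 1 / fs)
x = np.exp(2j * np.pi * (f0 * t + 0.5 * k * t ** 2))   # analytic LFM signal
print(wvd_if_estimate(x, 250, 64, fs))        # true IF at t = 0.25 s is 200 Hz
```

For an LFM signal the lag kernel is a pure complex exponential at twice the local IF, which is why the WVD peak tracks the chirp so cleanly.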
This work correlated detailed work zone location and time data from the WisLCS system with five-minute inductive loop detector data. A one-sample percentile value test and a two-sample Kolmogorov-Smirnov (K-S) test were applied to compare the speed and flow characteristics between work zone and non-work zone conditions. Furthermore, we analyzed the mobility characteristics of freeway work zones within the urban area of Milwaukee, WI, USA. More than 50% of the investigated work zones experienced speed reductions, and 15%-30% experienced reduced volumes. Speed reduction was more significant within and downstream of work zones than upstream.
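The two-sample K-S comparison can be sketched in pure Python; the speed samples below are synthetic stand-ins for the detector data, not the Milwaukee measurements:

```python
import bisect
import random

def ks_2samp_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance
    between the two empirical CDFs, evaluated at every observed value."""
    a, b = sorted(a), sorted(b)

    def ecdf(s, x):
        # fraction of the sorted sample s that is <= x
        return bisect.bisect_right(s, x) / len(s)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

random.seed(42)
# hypothetical 5-min average speeds (mph): non-work-zone vs work-zone
free_flow = [random.gauss(62.0, 4.0) for _ in range(300)]
work_zone = [random.gauss(54.0, 6.0) for _ in range(300)]
print(ks_2samp_stat(free_flow, work_zone))
```

A large statistic (here driven by the assumed 8 mph mean shift) indicates that the work zone speed distribution differs from the non-work-zone one; in practice `scipy.stats.ks_2samp` would also supply the p-value.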
Head-driven statistical models for natural language parsing are the most representative lexicalized syntactic parsing models, but they only utilize semantic dependency between words and do not incorporate other semantic information, such as semantic collocation and semantic category. Some improvements to this distinctive parser are presented. First, "valency" is an essential semantic feature of words: once the valency of a word is determined, the collocations of the word are clear, and the sentence structure can be derived directly. Thus, a syntactic parsing model combining valence structure with semantic dependency is proposed on the basis of head-driven statistical syntactic parsing models. Second, semantic role labeling (SRL) is necessary for deep natural language processing, so an integrated parsing approach is proposed to incorporate semantic parsing into the syntactic parsing process. Experiments were conducted with the refined statistical parser. The results show that 87.12% precision and 85.04% recall are obtained, and the F-measure is improved by 5.68% compared with the head-driven parsing model introduced by Collins.
The basic "current" statistical model and adaptive Kalman filter algorithm cannot track a weakly maneuvering target precisely, although they estimate a strongly maneuvering target accurately. To solve this problem, a novel nonlinear fuzzy membership function is presented to adjust the upper and lower limits of target acceleration adaptively, and the validity of the new algorithm for weakly maneuvering targets is proved theoretically. Finally, computer simulation experiments indicate that the new algorithm has a clear advantage over the basic "current" statistical model and adaptive algorithm.
The Okiep Copper District is the oldest mining district in South Africa, with a legacy of more than 150 years of mining. This legacy can be felt in the presence of large tailings dams scattered throughout the area. These tailings have a deleterious impact on the surrounding environment. To use geochemical methods in determining the scale of the impact, pre-mining background levels need to be determined. This is especially difficult in areas for which ...
The study of land surface temperature (LST) is of great significance for ecosystem monitoring and ecological environmental protection in the Qinling Mountains of China. In view of the trade-off between spatial and temporal resolution when extracting LST from satellite remote sensing (RS) data, areas with complex landforms in the Eastern Qinling Mountains were selected as the research targets to establish the correlation between the normalized difference vegetation index (NDVI) and LST. Detailed information on surface features and temporal changes in the land surface was provided by Sentinel-2 and Sentinel-3, respectively. Based on the statistical downscaling method, the spatial scale could be decreased from 1000 m to 10 m, and LST with Sentinel-3 temporal resolution and 10 m spatial resolution could be retrieved. Comparison of the 1 km resolution Sentinel-3 LST with the downscaled results showed that the 10 m downscaled LST data accurately reflect the spatial distribution of the thermal characteristics of the original LST image. Moreover, the 10 m high-spatial-resolution surface temperature data had clear texture and obvious geomorphic features that depict detailed information about the ground features. The results showed that the average error was 5 K on April 16, 2019 and 2.6 K on July 15, 2019. The smaller error value reflects the higher vegetation coverage of the summer downscaling result, which reached its highest level on July 15.
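An NDVI-based statistical downscaling of this kind (in the spirit of the TsHARP approach) can be sketched as follows; the NDVI field, regression, and 5:1 scale ratio are synthetic assumptions, not the Sentinel data:

```python
import numpy as np

def downscale_lst(lst_coarse, ndvi_coarse, ndvi_fine, scale):
    """Fit LST ~ a + b*NDVI at the coarse scale, predict at the fine scale,
    then add back the coarse-scale residual so each coarse pixel's mean
    temperature is preserved."""
    b, a = np.polyfit(ndvi_coarse.ravel(), lst_coarse.ravel(), 1)
    residual = lst_coarse - (a + b * ndvi_coarse)
    residual_fine = np.kron(residual, np.ones((scale, scale)))
    return a + b * ndvi_fine + residual_fine

rng = np.random.default_rng(1)
scale = 5
ndvi_fine = rng.uniform(0.1, 0.8, (20, 20))          # synthetic 10 m NDVI
ndvi_coarse = ndvi_fine.reshape(4, scale, 4, scale).mean(axis=(1, 3))
lst_coarse = 310.0 - 15.0 * ndvi_coarse + rng.normal(0, 0.3, (4, 4))  # K
lst_fine = downscale_lst(lst_coarse, ndvi_coarse, ndvi_fine, scale)
back = lst_fine.reshape(4, scale, 4, scale).mean(axis=(1, 3))
print(float(np.max(np.abs(back - lst_coarse))))      # coarse means preserved
```

The residual term is what keeps the downscaled field consistent with the original coarse image, which is the property checked when comparing the 10 m result against the 1 km Sentinel-3 LST.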
A three-dimensional transient numerical simulation was conducted to study pressure fluctuations in low-specific-speed centrifugal pumps. The characteristics of the inner flow were investigated using the SST k-ω turbulence model. The distributions of pressure fluctuations in the impeller and the volute were recorded, and the pressure fluctuation intensity was analyzed comprehensively at the design condition using statistical methods. The results show that the pressure fluctuation intensity increases along the impeller streamline from the leading edge to the trailing edge. In the impeller passage, the intensity near the shroud is much higher than that near the hub at the inlet; at the middle of the passage, however, the intensity is almost equal to that at the outlet. The pressure fluctuation intensity is highest at the trailing edge on the pressure side and near the tongue because of the rotor-stator interaction. The distribution of pressure fluctuation intensity is symmetrical in the axial cross sections of the volute channel, but this intensity decreases with increasing radial distance. Hence, the pressure fluctuation intensity can be reduced by modifying the geometry of the leading edge of the impeller and the tongue of the volute.
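One common statistical definition of pressure-fluctuation intensity at a monitoring point is the standard deviation of the transient pressure normalized by a reference dynamic pressure; the normalization and blade-passing signal below are illustrative assumptions, not the paper's exact definition:

```python
import math

def fluctuation_intensity(p_samples, rho, u2):
    """Standard deviation of the transient pressure at a monitoring point,
    normalized by the impeller-outlet dynamic pressure 0.5*rho*u2^2."""
    n = len(p_samples)
    mean = sum(p_samples) / n
    std = math.sqrt(sum((p - mean) ** 2 for p in p_samples) / n)
    return std / (0.5 * rho * u2 ** 2)

# synthetic blade-passing signal over one revolution: 6 blades, 5 kPa amplitude
rho, u2, amp, z = 1000.0, 20.0, 5000.0, 6     # water; assumed outlet speed m/s
p = [101325.0 + amp * math.sin(z * 2 * math.pi * i / 360) for i in range(360)]
print(fluctuation_intensity(p, rho, u2))
```

Evaluating this quantity at every monitoring point over one revolution yields exactly the kind of intensity maps the abstract describes for the impeller and volute.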
This paper describes field test results for 7.62×51 mm M61 AP (armour-piercing) ammunition fired into mild steel targets at an outdoor range. The targets varied from 10 mm to 32 mm in thickness. The tests recorded penetration depth, probability of perforation (i.e., complete penetration), muzzle and impact velocities, bullet mass, and plate yield strength and hardness. The measured penetration depth exhibited a variability of approximately ±12%. The paper then compares the ballistic test results with predictive models of steel penetration depth and of the thickness required to prevent perforation. Statistical parameters were derived for muzzle and impact velocity, bullet mass, plate thickness, plate hardness, and model error. A Monte-Carlo probabilistic analysis was then developed to estimate the probability of plate perforation by 7.62 mm M61 AP ammunition for a range of impact velocities, for mild steel and High Hardness Armour (HHA) plates. This perforation fragility analysis considered the random variability of impact velocity, bullet mass, plate thickness, plate hardness, and model error. Such a probabilistic analysis allows for reliability-based design where, for example, the plate thickness with 95% reliability (i.e., only 1 in 20 shots will perforate the wall) can be estimated knowing the probabilistic distribution of perforation. It was found that, for mild steel plates, the plate thickness needed to ensure a low 5% probability of perforation is 11-15% greater than that giving a 50/50 chance of perforation, and plates would need to be 20-30% thicker to reduce the probability of perforation to zero.
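The structure of such a Monte-Carlo fragility analysis can be sketched as follows; the penetration model and every distribution below are invented placeholders chosen only to show the mechanics, not the paper's fitted model or statistics:

```python
import random

def perforation_probability(thickness_mm, n_trials=50_000, seed=7):
    """Monte-Carlo perforation fragility: sample impact velocity, bullet mass,
    as-built plate thickness, and model error, then count trials whose
    predicted penetration depth exceeds the plate thickness.  The depth model
    depth = err * C * m * v^2 is a crude energy-scaling placeholder."""
    rng = random.Random(seed)
    C = 2.34e-3                           # mm per joule, tuned so nominal ~16 mm
    hits = 0
    for _ in range(n_trials):
        v = rng.gauss(838.0, 10.0)        # impact velocity, m/s
        m = rng.gauss(9.75e-3, 0.1e-3)    # bullet mass, kg
        t = rng.gauss(thickness_mm, 0.3)  # as-built plate thickness, mm
        err = rng.gauss(1.0, 0.12)        # model error (~ +/-12% variability)
        hits += err * C * m * v ** 2 > t
    return hits / n_trials

print(perforation_probability(16.0))      # near the 50/50 thickness
print(perforation_probability(20.0))      # thicker plate, lower probability
```

Sweeping `thickness_mm` and reading off where the probability falls to 5% reproduces the reliability-based design logic described above.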
A multivariate method for fault diagnosis and process monitoring is proposed. The technique is based on a statistical pattern (SP) framework integrated with a self-organizing map (SOM). The SP-based SOM is used as a classifier to distinguish various states on the output map, which makes it possible to monitor abnormal states visually. A case study of the Tennessee Eastman (TE) process is presented to demonstrate the fault diagnosis and process monitoring performance of the proposed method. Results show that the SP-based SOM method is a visual tool for real-time monitoring and fault diagnosis that can be used in complex chemical processes. Compared with other SOM-based methods, the proposed method monitors and diagnoses faults more efficiently.
A multifeature statistical image segmentation algorithm is described. Multiple features, such as grey level, edge magnitude, and correlation, are combined to form statistics in a multidimensional space. The statistical algorithm segments an image using the decision surface determined by the multidimensional feature function. A segmentation problem that is difficult to solve using the features independently can be readily solved using the same features jointly. An adaptive segmentation algorithm is also discussed. Test results from a newly developed real-time TV tracker have shown that the segmentation algorithm discussed here effectively improves image segmentation quality and system tracking performance.
The existing blockwise empirical likelihood (BEL) method blocks the observations or their analogues, which has proven useful in some dependent-data settings. In this paper, we introduce a new BEL (NBEL) method that instead blocks the scoring functions in high-dimensional cases. We study the construction of confidence regions for the parameters of spatial autoregressive models with spatial autoregressive disturbances (SARAR models) with a high-dimensional parameter using the NBEL method. It is shown that the NBEL ratio statistics are asymptotically χ²-type distributed, which is used to obtain NBEL-based confidence regions for the parameters of SARAR models. A simulation study is conducted to compare the performance of the NBEL method with that of the usual EL method.
Risk management often plays an important role in decision making under uncertainty. In quantitative risk management, assessing and optimizing risk metrics requires efficient computing techniques and reliable theoretical guarantees. In this paper, we introduce several topics in quantitative risk management and review some recent studies and advances on these topics. We consider several risk metrics and study decision models that involve these metrics, with a main focus on the related computing techniques and theoretical properties. We show that stochastic optimization, as a powerful tool, can be leveraged to address these problems effectively.
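Two of the most common risk metrics in this literature are value-at-risk (VaR) and conditional value-at-risk (CVaR, also called expected shortfall). The abstract does not name specific metrics, so the following empirical sketch is an illustrative assumption:

```python
def var_cvar(losses, alpha=0.95):
    """Empirical value-at-risk (the alpha-quantile of the loss sample) and
    conditional value-at-risk (the mean loss at or beyond that quantile)."""
    s = sorted(losses)
    idx = int(alpha * len(s))
    tail = s[idx:]
    return s[idx], sum(tail) / len(tail)

losses = list(range(1, 101))    # toy loss sample: 1, 2, ..., 100
print(var_cvar(losses))
```

CVaR is coherent and convex in the loss, which is precisely why stochastic optimization handles CVaR-constrained decision models so effectively.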
Federated learning (FL) is a distributed machine learning paradigm for edge cloud computing. FL can facilitate data-driven decision-making in tactical scenarios, effectively addressing both data volume and infrastructure challenges in edge environments. However, the diversity of clients in edge cloud computing presents significant challenges for FL. Personalized federated learning (pFL) has received considerable attention in recent years; one line of pFL work exploits the global and local information in the local model. Current pFL algorithms suffer from limitations such as slow convergence, catastrophic forgetting, and poor performance on complex tasks, and they still fall significantly short of centralized learning. To achieve high pFL performance, we propose FedCLCC: Federated Contrastive Learning and Conditional Computing. The core of FedCLCC is the combined use of contrastive learning and conditional computing: contrastive learning measures feature representation similarity to adjust the local model, while conditional computing separates the global and local information and feeds each to its corresponding head for global and local handling. Our comprehensive experiments demonstrate that FedCLCC outperforms other state-of-the-art FL algorithms.
This paper addresses the hypersonic glide vehicle (HGV) tracking problem under high maneuverability and non-stationary heavy-tailed measurement noise without prior statistics in complicated flight environments. Since interacting multiple model (IMM) filtering is known for its ability to cover the movement properties of motion models, the problem is formulated as modeling the non-stationary heavy-tailed measurement noise, without any prior statistics, within the IMM framework. First, the Gaussian-inverse Wishart distribution is embedded in an improved Pearson type-VII (PTV) distribution, which can adaptively adjust its parameters to model the non-stationary heavy-tailed measurement noise. In addition, the degree-of-freedom (DOF) parameters are obtained by maximizing the evidence lower bound (ELBO) in a variational Bayesian optimization framework, rather than being fixed, to handle uncertain non-Gaussian degrees. The paper then analytically derives fusion forms based on the maximum Versoria fusion criterion instead of the moment-matching approach, which provides a precise approximation of the PTV mixture distribution in the mixing and output steps, combined with weighted Kullback-Leibler averaging. Simulation results demonstrate the superiority and robustness of the proposed algorithm in typical HGV tracking when the measurement noise is non-stationary and its prior statistics are unavailable.
This paper develops a statistical damage constitutive model for deep rock that considers the effects of external load and thermal treatment temperature based on the distortion energy. The model parameters are determined from the extremum features of the stress-strain curve. The model predictions are then compared with experimental results for marble samples. It is found that, as the treatment temperature rises, the coupling damage evolution curve shows an S-shape, and the slope of its ascending branch gradually decreases during the coupling damage evolution process. At a constant temperature, confining pressure can suppress the expansion of micro-fractures; as the confining pressure increases, the rock exhibits ductile characteristics, and the shape of the coupling damage curve changes from an S-shape to a quasi-parabolic shape. The model can characterize well the influence of high temperature on the mechanical properties of deep rock and its brittle-ductile transition under confining pressure. It is also suitable for sandstone and granite, especially in predicting the pre-peak stage and peak stress of the stress-strain curve under the coupled action of confining pressure and high temperature. The results can provide a reference for further research on the constitutive relationships of rock-like materials and their engineering applications.
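The general shape of a Weibull statistical damage constitutive law, of which the paper's distortion-energy model is a refinement, can be sketched as follows; the modulus and Weibull parameters are invented for illustration:

```python
import math

def damage_stress(eps, E, F0, m):
    """Weibull statistical damage law: D = 1 - exp(-(eps/F0)^m) and
    sigma = E * eps * (1 - D).  The shape parameter m controls brittleness;
    the scale parameter F0 sets the macroscopic strength."""
    D = 1.0 - math.exp(-((eps / F0) ** m))
    return E * eps * (1.0 - D), D

E, F0, m = 50.0e9, 0.004, 3.0            # assumed values for a brittle rock
curve = [damage_stress(1e-5 * i, E, F0, m)[0] for i in range(1001)]
peak_i = curve.index(max(curve))
print(1e-5 * peak_i, max(curve) / 1e6)   # peak strain and peak stress (MPa)
```

The damage variable D rises from 0 toward 1 along the loading path, producing the pre-peak hardening and post-peak softening that the paper fits via the extremum features of the stress-strain curve.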
Funding: This work was funded through the India Meteorological Department, New Delhi, India, under the Forecasting Agricultural output using Space, Agrometeorology and Land based observations (FASAL) project, fund number ASC/FASAL/KT-11/01/HQ-2010.
Funding: Project 52274096 supported by the National Natural Science Foundation of China; Project WS2023A03 supported by the State Key Laboratory Cultivation Base for Gas Geology and Gas Control, China.
Funding: Project (2012ZX07501002-001) supported by the Ministry of Science and Technology of China
Abstract: Multivariate statistical techniques, such as cluster analysis (CA), discriminant analysis (DA), principal component analysis (PCA) and factor analysis (FA), were applied to evaluate and interpret the surface water quality data sets of the Second Songhua River (SSHR) basin in China, obtained during two years (2012-2013) of monitoring of 10 physicochemical parameters at 15 different sites. The results showed that most physicochemical parameters varied significantly among the sampling sites. Three significant groups of sampling sites, highly polluted (HP), moderately polluted (MP) and less polluted (LP), were obtained through hierarchical agglomerative CA on the basis of similarity of water quality characteristics. DA identified pH, F, DO, NH3-N, COD and VPhs as the most important parameters contributing to spatial variations of surface water quality. However, DA did not give a considerable data reduction (40% reduction). PCA/FA resulted in three, three and four latent factors explaining 70%, 62% and 71% of the total variance in the water quality data sets of the HP, MP and LP regions, respectively. FA revealed that the SSHR water chemistry is strongly affected by anthropogenic activities (point sources: industrial effluents and wastewater treatment plants; non-point sources: domestic sewage, livestock operations and agricultural activities) and natural processes (seasonal effects and natural inputs). PCA/FA over the whole basin showed the best results for data reduction because it used only two parameters (about 80% reduction) as the most important parameters to explain 72% of the data variation. Thus, this work illustrated the utility of multivariate statistical techniques for the analysis and interpretation of data sets and, in water quality assessment, for identifying pollution sources/factors and understanding spatial variations in water quality for effective stream water quality management.
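For two standardized variables, PCA has a closed form: the correlation matrix [[1, r], [r, 1]] has eigenvalues 1+|r| and 1-|r|, so the first principal component explains (1+|r|)/2 of the variance. The sketch below illustrates this with hypothetical COD and NH3-N readings (strongly co-varying, as would be expected for organically polluted waters); it is an illustration of the PCA idea, not the study's 10-parameter analysis.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def pc1_explained(x, y):
    """Fraction of total variance captured by PC1 for two standardized
    variables: eigenvalues of [[1, r], [r, 1]] are 1 + |r| and 1 - |r|."""
    r = abs(pearson(x, y))
    return (1 + r) / 2

# Hypothetical COD (mg/L) and NH3-N (mg/L) readings at five sites.
cod = [12.0, 30.0, 55.0, 20.0, 75.0]
nh3n = [0.4, 1.1, 2.0, 0.8, 2.9]
share = pc1_explained(cod, nh3n)   # close to 1: one factor dominates
```

The same eigen-decomposition logic, applied to the full correlation matrix, is what lets PCA/FA compress many monitored parameters into a few latent factors.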
Abstract: The instantaneous frequency (IF) estimation of linear frequency modulated (LFM) signals with time-varying amplitude using the peak of the Wigner-Ville distribution (WVD) is studied. Theoretical analysis shows that the estimation for LFM signals with time-varying amplitude is unbiased only if the WVD of the time-varying amplitude reaches its maximum at frequency zero at every time instant. The statistical performance in the case of additive white Gaussian noise is evaluated, and an analytical expression for the variance is provided. Simulations using LFM signals with Gaussian envelopes confirm that the IF can be estimated accurately using the peak of the WVD for four models of amplitude variation. Furthermore, the statistical result of estimation for signals whose amplitude descends before rising is better than that for signals with constant amplitude when the amplitude variation rate is moderate.
Funding: Project (61620106002) supported by the National Natural Science Foundation of China; Project (2016YFB0100906) supported by the National Key R&D Program of China; Project (2015364X16030) supported by the Information Technology Research Project of the Ministry of Transport of China; Project (2242015K42132) supported by the Fundamental Sciences of Southeast University, China
Abstract: This work correlated detailed work zone location and time data from the Wis LCS system with five-min inductive loop detector data. A one-sample percentile value test and a two-sample Kolmogorov-Smirnov (K-S) test were applied to compare speed and flow characteristics between work zone and non-work zone conditions. Furthermore, we analyzed the mobility characteristics of freeway work zones within the urban area of Milwaukee, WI, USA. More than 50% of the investigated work zones experienced speed reductions, and 15%-30% experienced reduced volumes. Speed reduction was more significant within and downstream of work zones than upstream.
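The two-sample K-S test named above compares two empirical distributions via the maximum vertical gap between their empirical CDFs. A minimal sketch of that statistic, using hypothetical 5-min speed samples (mph) for non-work-zone and work-zone periods:

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of samples a and b."""
    values = sorted(set(a) | set(b))

    def ecdf(sample, v):
        # fraction of observations less than or equal to v
        return sum(1 for s in sample if s <= v) / len(sample)

    return max(abs(ecdf(a, v) - ecdf(b, v)) for v in values)

# Hypothetical 5-min mean speeds (mph): free-flow vs work-zone periods.
free = [62, 64, 65, 63, 66, 61, 64]
work = [48, 52, 50, 55, 47, 51, 49]
d = ks_statistic(free, work)   # the two samples do not overlap, so D = 1.0
```

In practice the statistic D would be compared against a critical value depending on the sample sizes and significance level to decide whether the work-zone speed distribution differs significantly.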
Funding: Project (61262035) supported by the National Natural Science Foundation of China; Projects (GJJ12271, GJJ12742) supported by the Science and Technology Foundation of the Education Department of Jiangxi Province, China; Project (20122BAB201033) supported by the Natural Science Foundation of Jiangxi Province, China
Abstract: Head-driven statistical models for natural language parsing are the most representative lexicalized syntactic parsing models, but they only utilize semantic dependency between words and do not incorporate other semantic information such as semantic collocation and semantic category. Some improvements on this distinctive parser are presented. Firstly, "valency" is an essential semantic feature of words. Once the valency of a word is determined, the collocation of the word is clear, and the sentence structure can be directly derived. Thus, a syntactic parsing model combining valence structure with semantic dependency is proposed on the basis of head-driven statistical syntactic parsing models. Secondly, semantic role labeling (SRL) is very necessary for deep natural language processing. An integrated parsing approach is proposed to integrate semantic parsing into the syntactic parsing process. Experiments were conducted with the refined statistical parser. The results show that 87.12% precision and 85.04% recall are obtained, and the F-measure is improved by 5.68% compared with the head-driven parsing model introduced by Collins.
Abstract: The basic "current" statistical model and adaptive Kalman filter algorithm cannot track a weakly maneuvering target precisely, though it has good estimation accuracy for a strongly maneuvering target. To solve this problem, a novel nonlinear fuzzy membership function was presented to adjust the upper and lower limits of target acceleration adaptively, and the validity of the new algorithm for weakly maneuvering targets was proved in theory. Finally, computer simulation experiments indicated that the new algorithm has a great advantage over the basic "current" statistical model and adaptive algorithm.
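The abstract does not give the exact membership function, so the sketch below is only a plausible shape: a Gaussian-type nonlinear membership of the measurement innovation that blends the acceleration bound between a small value (weak maneuver) and a large one (strong maneuver). All parameter values are hypothetical.

```python
import math

def membership(innovation, scale):
    """Nonlinear fuzzy membership in [0, 1): grows with the magnitude of
    the measurement innovation and saturates for strong maneuvers."""
    return 1.0 - math.exp(-((abs(innovation) / scale) ** 2))

def accel_limit(innovation, a_min=0.5, a_max=30.0, scale=5.0):
    """Adaptive acceleration bound for a 'current'-model filter: blend
    between a_min and a_max (m/s^2, hypothetical) by the membership value."""
    mu = membership(innovation, scale)
    return a_min + mu * (a_max - a_min)
```

A small innovation keeps the bound near a_min, so the filter stays smooth for weakly maneuvering targets; a large innovation pushes the bound toward a_max, preserving responsiveness to strong maneuvers.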
Abstract: The Okiep Copper District is the oldest mining district in South Africa, with a legacy of more than 150 years of mining. This legacy can be felt in the presence of large tailings dams scattered throughout the area. These tailings have a deleterious impact on the surrounding environment. To use geochemical methods in determining the scale of the impact, pre-mining background levels need to be determined. This is especially difficult in areas for which
Funding: Supported by the National Key R&D Plan (2018YFC1506500); the Open Research Fund Project of the Key Laboratory of Ecological Environment Meteorology of Qinling Mountains and Loess Plateau, Shaanxi Provincial Meteorological Bureau (2020Y-13); the Open Research Fund of the Shangluo Key Laboratory of Climate Adaptable City (SLSYS2022007); and the Shangluo Demonstration Project of the Qinling Ecological Monitoring Service System (2020-611002-74-01-006200).
Abstract: The study of land surface temperature (LST) is of great significance for ecosystem monitoring and ecological environmental protection in the Qinling Mountains of China. In view of the conflicting spatial and temporal resolutions involved in extracting LST from satellite remote sensing (RS) data, areas with complex landforms in the Eastern Qinling Mountains were selected as the research targets to establish the correlation between the normalized difference vegetation index (NDVI) and LST. Detailed information on surface features and temporal changes in the land surface was provided by Sentinel-2 and Sentinel-3, respectively. Based on the statistical downscaling method, the spatial scale could be decreased from 1000 m to 10 m, and LST with the Sentinel-3 temporal resolution and a 10 m spatial resolution could be retrieved. Comparison of the 1 km resolution Sentinel-3 LST with the downscaled results showed that the 10 m LST data accurately reflect the spatial distribution of the thermal characteristics of the original LST image. Moreover, the surface temperature data at the 10 m spatial resolution had clear texture and obvious geomorphic features that depict detailed information about the ground features. The results showed that the average error was 5 K on April 16, 2019 and 2.6 K on July 15, 2019. The smaller summer error reflects the higher vegetation coverage of the summer downscaling result, which was highest on July 15.
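Statistical NDVI-LST downscaling of this kind typically follows a TsHARP-style recipe: fit a regression between NDVI and LST at the coarse resolution, apply it to the fine-resolution NDVI, and add the coarse-pixel residual back so the original LST is preserved on average. The sketch below uses hypothetical values and ordinary least squares; it is an illustration of the general method, not the study's calibrated model.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical coarse-pixel samples: LST (K) falls as NDVI rises.
ndvi_coarse = [0.2, 0.35, 0.5, 0.65, 0.8]
lst_coarse = [310.0, 306.5, 303.0, 299.5, 296.0]
a, b = fit_line(ndvi_coarse, lst_coarse)

def downscale(ndvi_fine, lst_obs_coarse, ndvi_of_coarse_pixel):
    """Apply the coarse regression to fine NDVI, then add back the
    coarse-pixel residual (observed minus predicted) to preserve LST."""
    residual = lst_obs_coarse - (a + b * ndvi_of_coarse_pixel)
    return [a + b * v + residual for v in ndvi_fine]

# Three fine pixels inside one coarse pixel whose NDVI is 0.5, LST 303.0 K.
fine = downscale([0.30, 0.45, 0.60], 303.0, 0.5)
```

The residual correction is what keeps the 10 m field consistent with the 1 km observation: averaging the fine pixels back up reproduces the coarse LST where the regression is exact.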
Funding: Projects (51239005, 51009072) supported by the National Natural Science Foundation of China; Project (2011BAF14B04) supported by the National Science & Technology Pillar Program of China; Project (13JDG084) supported by the Research Foundation for Advanced Talents of Jiangsu University, China
Abstract: A three-dimensional transient numerical simulation was conducted to study the pressure fluctuations in low-specific-speed centrifugal pumps. The characteristics of the inner flow were investigated using the SST k-ω turbulence model. The distributions of pressure fluctuations in the impeller and the volute were recorded, and the pressure fluctuation intensity was analyzed comprehensively at the design condition using statistical methods. The results show that the pressure fluctuation intensity increases along the impeller streamline from the leading edge to the trailing edge. In the impeller passage, the intensity near the shroud is much higher than that near the hub at the inlet; however, the intensity at the middle of the passage is almost equal to that at the outlet. The pressure fluctuation intensity is highest at the trailing edge on the pressure side and near the tongue because of the rotor-stator interaction. The distribution of pressure fluctuation intensity is symmetrical in the axial cross sections of the volute channel, but this intensity decreases with increasing radial distance. Hence, the pressure fluctuation intensity can be reduced by modifying the geometry of the leading edge in the impeller and the tongue in the volute.
Funding: The authors appreciate the laboratory assistance of Goran Simundic and Michael Goodwin with the measurement of the field test results. The assistance of final-year honours student Richard Szlicht is gratefully acknowledged.
Abstract: The paper describes field test results of 7.62×51 mm M61 AP (armour piercing) ammunition fired into mild steel targets at an outdoor range. The targets varied from 10 mm to 32 mm in thickness. The tests recorded penetration depth, probability of perforation (i.e., complete penetration), muzzle and impact velocities, bullet mass, and plate yield strength and hardness. The measured penetration depth exhibited a variability of approximately ±12%. The paper then compared the ballistic test results with predictive models of steel penetration depth and of the thickness needed to prevent perforation. Statistical parameters were derived for muzzle and impact velocity, bullet mass, plate thickness, plate hardness, and model error. A Monte-Carlo probabilistic analysis was then developed to estimate the probability of plate perforation by 7.62 mm M61 AP ammunition for a range of impact velocities, for mild steels and High Hardness Armour (HHA) plates. This perforation fragility analysis considered the random variability of impact velocity, bullet mass, plate thickness, plate hardness, and model error. Such a probabilistic analysis allows for reliability-based design, where, for example, the plate thickness with 95% reliability (i.e., only 1 in 20 shots will perforate the wall) can be estimated from the probabilistic distribution of perforation. It was found that the plate thickness needed to ensure a low 5% probability of perforation must be 11-15% greater than that required for a 50/50 chance of perforation for mild steel plates. Plates would need to be 20-30% thicker if the probability of perforation is to be reduced to zero.
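A Monte-Carlo perforation fragility analysis of the kind described can be sketched in a few lines. The penetration model below is a deliberately simplified energy-scaling stand-in (depth proportional to m·v²), not the paper's calibrated model, and every distribution parameter is hypothetical; the point is the structure: sample the random variables, evaluate the model with a model-error factor, and count perforations.

```python
import random

random.seed(1)

def penetration_depth(v, m, k):
    """Toy energy-based depth model in mm: depth = k * m * v^2.
    k lumps target resistance; NOT the calibrated model from the paper."""
    return k * m * v * v

def perforation_probability(t_nominal, n=20000):
    """Monte-Carlo estimate of P(depth > plate thickness), sampling impact
    velocity, bullet mass, plate thickness and model error."""
    hits = 0
    for _ in range(n):
        v = random.gauss(830.0, 20.0)        # impact velocity, m/s
        m = random.gauss(9.75e-3, 0.1e-3)    # bullet mass, kg
        t = random.gauss(t_nominal, 0.3)     # plate thickness, mm
        err = random.gauss(1.0, 0.12)        # model error (~±12% scatter)
        if penetration_depth(v, m, 2.2e-3) * err > t:
            hits += 1
    return hits / n

p_thin = perforation_probability(12.0)    # thin plate: perforation likely
p_thick = perforation_probability(20.0)   # thick plate: perforation rare
```

Sweeping t_nominal and reading off the thickness where the estimated probability crosses 5% is exactly the reliability-based design step the abstract describes.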
Funding: Project (2013CB733605) supported by the National Basic Research Program of China; Project (21176073) supported by the National Natural Science Foundation of China; and a project supported by the Fundamental Research Funds for the Central Universities, China
Abstract: A multivariate method for fault diagnosis and process monitoring is proposed. This technique is based on a statistical pattern (SP) framework integrated with a self-organizing map (SOM). An SP-based SOM is used as a classifier to distinguish various states on the output map, which can visually monitor abnormal states. A case study of the Tennessee Eastman (TE) process is presented to demonstrate the fault diagnosis and process monitoring performance of the proposed method. Results show that the SP-based SOM method is a visual tool for real-time monitoring and fault diagnosis that can be used in complex chemical processes. Compared with other SOM-based methods, the proposed method can more efficiently monitor and diagnose faults.
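The SP framework itself is not reproduced here; as a minimal sketch of the SOM component only, the code below trains a tiny one-dimensional map on two hypothetical process states (a "normal" and a "faulty" sensor pattern) so that distinct states land on distinct units of the output map, which is the classification idea the abstract relies on.

```python
import math
import random

random.seed(0)

def train_som(data, n_units=4, epochs=60, lr0=0.5, sigma0=2.0):
    """Minimal 1-D self-organizing map: each unit holds a weight vector;
    the best-matching unit (BMU) and its map neighbours move toward
    each input, with learning rate and neighbourhood width decaying."""
    dim = len(data[0])
    units = [[random.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.1
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2
                                        for u, v in zip(units[i], x)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                units[i] = [u + lr * h * (v - u)
                            for u, v in zip(units[i], x)]
    return units

def bmu_index(units, x):
    """Index of the unit closest to input x (its position on the map)."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))

# Two hypothetical 2-D process states: normal near (0.1, 0.1), faulty
# near (0.9, 0.9), each with small random scatter.
normal = [[0.1 + random.uniform(-0.05, 0.05),
           0.1 + random.uniform(-0.05, 0.05)] for _ in range(20)]
faulty = [[0.9 + random.uniform(-0.05, 0.05),
           0.9 + random.uniform(-0.05, 0.05)] for _ in range(20)]
som = train_som(normal + faulty)
```

After training, plotting which unit each new sample maps to gives the visual monitoring view the abstract describes: samples drifting toward the "faulty" region of the map flag an abnormal state.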
Abstract: A multifeature statistical image segmentation algorithm is described. Multiple features such as grey level, edge magnitude and correlation are combined to form statistics in a multidimensional feature space. The statistical algorithm segments an image using the decision surface determined by the multidimensional feature function. Segmentation problems that are difficult to solve using the features independently can be readily solved using the same features jointly. An adaptive segmentation algorithm is also discussed. Test results from the newly developed real-time TV tracker have shown that the segmentation algorithm discussed here effectively improves image segmentation quality and system tracking performance.
Funding: Supported by the National Natural Science Foundation of China (12061017, 12361055) and the Research Fund of the Guangxi Key Lab of Multi-source Information Mining & Security (22-A-01-01).
Abstract: The existing blockwise empirical likelihood (BEL) method blocks the observations or their analogues, which has proven useful in some dependent-data settings. In this paper, we introduce a new BEL (NBEL) method that blocks the scoring functions in high-dimensional cases. We study the construction of confidence regions for the parameters of spatial autoregressive models with spatial autoregressive disturbances (SARAR models) with a high-dimensional parameter by using the NBEL method. It is shown that the NBEL ratio statistics are asymptotically χ²-type distributed, which is used to obtain NBEL-based confidence regions for the parameters of SARAR models. A simulation study is conducted to compare the performance of the NBEL and the usual EL methods.
Abstract: Risk management often plays an important role in decision making under uncertainty. In quantitative risk management, assessing and optimizing risk metrics requires efficient computing techniques and reliable theoretical guarantees. In this paper, we introduce several topics in quantitative risk management and review some of the recent studies and advancements on these topics. We consider several risk metrics and study decision models that involve the metrics, with a main focus on the related computing techniques and theoretical properties. We show that stochastic optimization, as a powerful tool, can be leveraged to effectively address these problems.
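Two of the most common risk metrics in this literature are Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR, expected shortfall). As a minimal sketch with a hypothetical loss sample, the empirical versions are a one-line sort-and-average:

```python
def var_cvar(losses, alpha=0.95):
    """Empirical Value-at-Risk and Conditional Value-at-Risk at level
    alpha: VaR is the alpha-quantile of the loss sample, CVaR is the
    mean of the losses at or beyond VaR."""
    s = sorted(losses)
    k = int(alpha * len(s))   # index of the alpha-quantile
    var = s[k]
    tail = s[k:]              # losses in the upper (1 - alpha) tail
    cvar = sum(tail) / len(tail)
    return var, cvar

# Hypothetical loss sample; at alpha = 0.8 the quantile index is 8.
losses = [1, 2, 2, 3, 3, 3, 4, 5, 8, 20]
v, c = var_cvar(losses, alpha=0.8)   # VaR = 8, CVaR = (8 + 20) / 2 = 14
```

CVaR being an average over the tail is what makes it coherent and, via the Rockafellar-Uryasev reformulation, amenable to the stochastic optimization techniques the abstract highlights.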
Funding: Supported by the Natural Science Foundation of Xinjiang Uygur Autonomous Region (Grant No. 2022D01B187).
Abstract: Federated learning (FL) is a distributed machine learning paradigm for edge cloud computing. FL can facilitate data-driven decision-making in tactical scenarios, effectively addressing both data volume and infrastructure challenges in edge environments. However, the diversity of clients in edge cloud computing presents significant challenges for FL. Personalized federated learning (pFL) has received considerable attention in recent years. One approach to pFL involves exploiting the global and local information in the local model. Current pFL algorithms suffer from limitations such as slow convergence, catastrophic forgetting, and poor performance on complex tasks, and still fall significantly short of centralized learning. To achieve high pFL performance, we propose FedCLCC: Federated Contrastive Learning and Conditional Computing. The core of FedCLCC is the use of contrastive learning and conditional computing. Contrastive learning determines the feature representation similarity to adjust the local model. Conditional computing separates the global and local information and feeds them to their corresponding heads for global and local handling. Our comprehensive experiments demonstrate that FedCLCC outperforms other state-of-the-art FL algorithms.
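FedCLCC's contrastive and conditional-computing components are not reproducible from the abstract alone. As context, the sketch below shows only the baseline FedAvg aggregation step that pFL methods like this one build on: the server averages client parameters weighted by local dataset size. Client weights and sizes are hypothetical.

```python
def fed_avg(client_weights, client_sizes):
    """One FedAvg aggregation round: average the clients' model parameter
    vectors, weighted by each client's local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(dim)]

# Three hypothetical clients with 2-parameter models and unequal data
# volumes; the largest client pulls the global model hardest.
w_global = fed_avg([[1.0, 0.0], [3.0, 2.0], [2.0, 1.0]], [10, 30, 60])
```

Personalized schemes then depart from this by keeping part of the model local (e.g. a personal head) while only the shared part is aggregated, which is the "global vs local information" split the abstract refers to.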
Funding: Supported by the National Natural Science Foundation of China (12072090).
Abstract: This paper addresses the hypersonic glide vehicle (HGV) tracking problem considering high maneuverability and non-stationary heavy-tailed measurement noise without prior statistics in complicated flight environments. Since interacting multiple model (IMM) filtering is well known for its ability to cover the movement properties of motion models, the problem is formulated as modeling the non-stationary heavy-tailed measurement noise, without any prior statistics, within the IMM framework. Firstly, the Gaussian-inverse Wishart distribution is embedded in the improved Pearson type-VII (PTV) distribution, which can adaptively adjust its parameters to model the non-stationary heavy-tailed measurement noise without any prior statistics. In addition, the degree-of-freedom (DOF) parameters are obtained by maximizing the evidence lower bound (ELBO) in the variational Bayesian optimization framework, instead of being fixed, to handle uncertain non-Gaussian degrees. The paper then analytically derives fusion forms based on the maximum Versoria fusion criterion instead of the moment-matching approach, which provides a precise approximation of the PTV mixture distribution in the mixing and output steps, combined with the weighted Kullback-Leibler average theory. Simulation results demonstrate the superiority and robustness of the proposed algorithm in tracking typical HGVs when the measurement noise, with no prior statistics, is non-stationary.
Funding: Project (11272119) supported by the National Natural Science Foundation of China.
Abstract: This paper developed a statistical damage constitutive model for deep rock that considers the effects of external load and thermal treatment temperature, based on the distortion energy. The model parameters were determined from the extremum features of the stress-strain curve. The model predictions were then compared with experimental results for marble samples. It is found that as the treatment temperature rises, the coupling damage evolution curve takes an S-shape and the slope of its ascending branch gradually decreases during the coupling damage evolution process. At a constant temperature, confining pressure can suppress the expansion of micro-fractures. As the confining pressure increases, the rock exhibits ductile characteristics, and the shape of the coupling damage curve changes from an S-shape to a quasi-parabolic shape. The model can well characterize the influence of high temperature on the mechanical properties of deep rock and its brittleness-ductility transition under confining pressure. It is also suitable for sandstone and granite, especially in predicting the pre-peak stage and peak stress of the stress-strain curve under the coupled action of confining pressure and high temperature. These results can provide a reference for further research on the constitutive relationships of rock-like materials and their engineering applications.