A novel method for noise removal from the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using the Chebyshev optimal consistent approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as most effective for processing gravity gradient data. The results of hardware-in-the-loop simulation and prototype experiments show that, compared with other commonly used methods, the proposed processing method achieves a 14% improvement in the measurement variance of gravity gradient signals and a measurement accuracy within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insights for high-precision airborne gravity gradiometry.
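The head-to-tail expansion plus zero-phase filtering idea described above can be illustrated with a minimal numpy sketch. The moving-average kernel and the padding length here are placeholders for the paper's optimized band-pass design, so this is an illustration of the principle, not the published filter:

```python
import numpy as np

def zero_phase_smooth(x, kernel_len=5, pad=None):
    """Zero-phase smoothing: mirror-extend the record head-to-tail,
    filter forward and backward, then trim the extension.
    Assumes pad < len(x)."""
    if pad is None:
        pad = kernel_len * 3
    h = np.ones(kernel_len) / kernel_len           # toy FIR low-pass (moving average)
    # head-to-tail expansion: reflect the signal about its endpoints
    ext = np.concatenate([x[pad:0:-1], x, x[-2:-pad - 2:-1]])
    y = np.convolve(ext, h, mode="same")            # forward pass
    y = np.convolve(y[::-1], h, mode="same")[::-1]  # backward pass cancels the phase lag
    return y[pad:pad + len(x)]
```

The mirrored padding keeps the forward-backward passes from smearing edge transients into the retained record, which is the point of the head-to-tail expansion.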
Breakage is an important step in the resource processing chain. However, the mechanical crushing methods commonly used today suffer from low energy efficiency and high dust levels. Promoting environmental protection and improving energy efficiency are crucial to advancing China's circular economy. Mining companies are actively exploring novel technologies to significantly cut operating costs and minimize emissions of dust and pollutants generated during processing. Recently, high voltage pulse discharge (HVPD) technology has received widespread attention and has been reported to have good application prospects in resource processing. This paper presents an extensive review of the operational principles of HVPD and its distinctive characteristics, such as non-polluting operation, selective material fragmentation, pre-weakening, pre-concentration, and enhanced permeability of coal seams. Additionally, this review explores the potential of and obstacles confronting HVPD in industrial contexts, offering fresh insights for HVPD optimization and providing guidance and prospects for industrial deployment and further development.
Laser powder-bed fusion (LPBF) of Zn-0.8Cu (wt.%) alloys exhibits significant advantages in the customization of biodegradable bone implants. However, the formability of LPBFed Zn alloys is insufficient due to spheroidization during the interaction of the powder and the laser beam, the mechanism of which is still not well understood. In this study, the evolution of the morphology and grain structure of the LPBFed Zn-Cu alloy was investigated based on single-track deposition experiments. As the scanning speed increases, the grain structure of a single track of Zn-Cu alloy gradually refines, but the formability deteriorates, leading to defect formation in subsequent fabrication. The Zn-Cu alloys fabricated with optimum processing parameters exhibit a tensile strength of 157.13 MPa, a yield strength of 106.48 MPa, and an elongation of 14.7%. This work provides a comprehensive understanding of the processing optimization of Zn-Cu alloy, achieving LPBFed Zn-Cu alloy with high density and excellent mechanical properties.
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, it is difficult to use traditional probability theory to process the samples and assess the degree of uncertainty. Using grey relational theory and norm theory, the grey distance information approach, which is based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples, is proposed in this article. The definitions of the grey distance information quantity of a sample and the average grey distance information quantity of the samples, with their characteristics and algorithms, are introduced. The correlative problems, including the algorithm of the estimated value, the standard deviation, and the acceptance and rejection criteria of the samples and estimated results, are also discussed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare the different samples. Several examples are given to demonstrate the application of the proposed approach. The examples show that the proposed approach, which makes no demand on the probability distribution of small samples, is feasible and effective.
In a non-homogeneous environment, traditional space-time adaptive processing does not effectively suppress interference and detect targets, because the secondary data do not exactly reflect the statistical characteristics of the range cell under test. A novel methodology utilizing the direct data domain approach to space-time adaptive processing (STAP) in airborne radar non-homogeneous environments is presented. The deterministic least squares adaptive signal processing technique operates on a snapshot-by-snapshot basis to determine the adaptive weights for nulling interference and estimating the signal of interest (SOI). Furthermore, this approach eliminates the requirement of estimating the covariance matrix from the data of neighboring range cells, avoids computing the inverse of the covariance matrix, and can be implemented to operate in real time. Simulation results illustrate the efficiency of interference suppression in a non-homogeneous environment.
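The single-snapshot flavor of this idea, deriving weights from the test cell's own data rather than neighboring-cell covariance, can be sketched as a constrained least-squares beamformer. This is a generic illustration, not the paper's exact direct-data-domain formulation; the diagonal loading term is an added assumption for numerical stability:

```python
import numpy as np

def single_snapshot_weights(snapshots, steering, loading=1e-3):
    """Least-squares adaptive weights from the cell under test only:
    minimize interference output power subject to unit gain on the
    signal of interest (SOI). No neighboring-cell covariance is used."""
    A = np.atleast_2d(snapshots)                    # rows: samples of the array snapshot
    R = A.conj().T @ A + loading * np.eye(A.shape[1])
    w = np.linalg.solve(R, steering)                # R^{-1} s
    return w / (steering.conj() @ w)                # enforce s^H w = 1
```

With a strong interferer in the snapshot data, the resulting weights keep unit gain toward the SOI while placing a deep null on the interferer direction.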
Due to the limited scenes that synthetic aperture radar (SAR) satellites can detect, the full-track utilization rate is not high. Because of the computing and storage limitations of a single satellite, it is difficult to process large amounts of spaceborne SAR data. A new method of networked satellite data processing is proposed to improve the efficiency of data processing. A multi-satellite distributed SAR real-time processing method based on the Chirp Scaling (CS) imaging algorithm is studied in this paper, and a distributed data processing system is built with field programmable gate array (FPGA) chips as the kernel. Different from traditional CS algorithm processing, the system divides data processing into three stages. The computing tasks are reasonably allocated to different data processing units (i.e., satellites) in each stage. The method effectively saves the computing and storage resources of satellites, improves the utilization rate of a single satellite, and shortens the data processing time. Gaofen-3 (GF-3) satellite SAR raw data is processed by the system, and the performance of the method is verified.
Due to the increasing number of cloud applications, the amount of data in the cloud shows signs of growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most current cloud storage systems adopt a hash-like approach to retrieving data that only supports simple keyword-based enquiries but lacks various forms of information search. Therefore, a scalable and efficient indexing scheme is clearly required. In this paper, we present SLC-index, a novel, scalable skip list-based indexing scheme for cloud data processing. The SLC-index offers a two-layered architecture for extending the indexing scope and facilitating better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, it is a flexible system owing to dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
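The two-layered lookup idea behind a skip-list-style index can be sketched in a few lines. This is a toy in-memory version for intuition only; the real SLC-index distributes index nodes across servers and migrates them for load balancing, which is not modeled here:

```python
import bisect

class TwoLayerIndex:
    """Toy two-layer index in the spirit of a skip list: a sparse upper
    layer with fan-out `stride` routes a lookup into one block of the
    sorted lower layer, so point and range queries touch few keys."""

    def __init__(self, keys, stride=16):
        self.lower = sorted(keys)
        self.stride = stride
        self.upper = self.lower[::stride]          # every stride-th key

    def point(self, key):
        # route via the upper layer, then search one lower-layer block
        block = max(bisect.bisect_right(self.upper, key) - 1, 0)
        lo = block * self.stride
        hi = min(lo + self.stride, len(self.lower))
        i = bisect.bisect_left(self.lower, key, lo, hi)
        return i < hi and self.lower[i] == key

    def range(self, lo_key, hi_key):
        i = bisect.bisect_left(self.lower, lo_key)
        j = bisect.bisect_right(self.lower, hi_key)
        return self.lower[i:j]
```

Searching the sparse upper layer first is what keeps both point and range queries cheap as the key set grows, mirroring the skip list's probabilistic levels with a deterministic stride.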
In this paper, an estimation method for reliability parameters in the case of zero-failure data, the synthetic estimation method, is given. For zero-failure data of the double-parameter exponential distribution, a hierarchical Bayesian estimation of the failure probability is presented. After failure information is introduced, hierarchical Bayesian estimation and synthetic estimation of the failure probability, as well as synthetic estimation of reliability, are given. Calculation and analysis are performed for practical problems in which the life distribution of an engine obeys the double-parameter exponential distribution.
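As a point of reference for the zero-failure setting, the standard conjugate Beta-Binomial update gives a simple failure-probability estimate. This is a textbook sketch with an assumed Beta(a, b) prior, not the paper's hierarchical or synthetic estimator:

```python
def zero_failure_estimate(n_tests, a=1.0, b=1.0):
    """Posterior-mean failure probability after n_tests trials with zero
    failures, under an assumed Beta(a, b) prior: since zero failures were
    observed, the posterior is Beta(a, b + n_tests), whose mean is below."""
    return a / (a + b + n_tests)
```

The estimate shrinks toward zero as failure-free tests accumulate, which is the basic behavior any zero-failure method must reproduce.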
Beam-hopping technology has become one of the major research hotspots in satellite communication as a way to enhance communication capacity and flexibility. However, beam hopping turns the traditional continuous time-division multiplexed signal in the forward downlink into a burst signal, so satellite terminal receivers need to solve multiple key issues such as rapid burst-signal synchronization and high-performance reception. Firstly, this paper analyzes the key issues of burst communication for traffic signals in beam-hopping systems, and then compares typical carrier synchronization algorithms for burst signals. Secondly, combining the requirements of beam-hopping communication systems for efficient burst reception and low signal-to-noise ratio reception of downlink signals in forward links, a decoding-assisted bidirectional variable-parameter iterative carrier synchronization technique is proposed, which introduces the idea of iterative processing into carrier synchronization. Aimed at the technical characteristics of communication signal carrier synchronization, a new bidirectional variable-parameter iteration approach is adopted, breaking through the traditional understanding that loop structures cannot adapt to low signal-to-noise ratio burst demodulation. Finally, research and performance simulation are conducted using the DVB-S2X standard physical layer frame format adopted in high-throughput satellite communication systems. The results show that the proposed technique can significantly shorten the carrier synchronization time of burst signals, achieve fast synchronization of low signal-to-noise ratio burst signals, and has the unique advantage of flexible and adjustable parameters.
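For contrast with the decoding-assisted iterative scheme, a classical feedforward burst phase estimator fits in one line. This Viterbi-Viterbi style baseline illustrates conventional burst carrier synchronization, not the technique proposed above:

```python
import numpy as np

def vv_phase_estimate(burst, order=4):
    """Feedforward carrier phase estimate for an M-PSK burst: raising the
    samples to the M-th power strips the data modulation, so the angle of
    the average, divided back by M, recovers the common phase offset
    (up to the inherent 2*pi/M ambiguity)."""
    return np.angle(np.mean(burst ** order)) / order
```

Feedforward estimators like this work on a whole burst at once, which is why they remain the usual starting point before iterative refinements are layered on.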
Through analyzing the needs of seismic data processing and interpretation, a system model based on CSCW is designed. Using CSCW technology to build a cooperative work environment allows field data acquisition to possess the functions of remote real-time guidance by experts and remote real-time processing of the data. The model overcomes the influences and barriers existing in the areas
A distributed/parallel-processing system like Sun Grid Engine (SGE) that utilizes multiple nodes/cores is proposed for faster processing of large satellite image data. After verification, pre-processing performance in the distributed processing environment can be improved by up to 560.65% over a single-processing system. Through this, analysis performance in various fields can be improved, and moreover, near-real-time service can be achieved in the near future.
This paper selected the corn processing industry technology innovation alliance in Heilongjiang Province as the research object and evaluated the operational performance of the alliance using the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation methods. The AHP empirical results showed that satisfaction with information communication and satisfaction with the management process were the weakest. The order, from high to low, of the impact of the indicator levels on the alliance was: the result of alliance operations, the process of alliance operations, and the behavioral attitude of alliance members. Besides, the results of the fuzzy comprehensive evaluation showed that the operational performance of the corn processing industry technology innovation alliance in Heilongjiang Province was at a general level.
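The AHP step, turning pairwise comparisons into criterion weights, reduces to a principal-eigenvector computation. This is a generic sketch of that step; the alliance's actual comparison matrices and the fuzzy evaluation stage are not reproduced here:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from an AHP pairwise comparison matrix:
    the principal eigenvector, normalized to sum to one."""
    M = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(M)
    w = np.abs(vecs[:, vals.real.argmax()].real)   # principal eigenvector
    return w / w.sum()
```

For a perfectly consistent matrix (every entry the ratio of two underlying weights), the method returns those weights exactly; real survey matrices are only approximately consistent, which AHP quantifies with a consistency ratio.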
Estimating trawler fishing effort plays a critical role in characterizing marine fisheries activities, quantifying the ecological impact of trawling, and refining regulatory frameworks and policies. Understanding trawler fishing inputs offers crucial scientific data to support the sustainable management of offshore fishery resources in China. An XGBoost algorithm was introduced and optimized through Harris Hawks Optimization (HHO) to develop a model for identifying trawler fishing behaviour. The model demonstrated exceptional performance, achieving accuracy, sensitivity, specificity, and Matthews correlation coefficient of 0.9713, 0.9806, 0.9632, and 0.9425, respectively. Using this model to detect fishing activities, the fishing effort of trawlers from Shandong Province in the sea area between 119°E to 124°E and 32°N to 40°N in 2021 was quantified. A heatmap of fishing effort, generated at a spatial resolution of 1/8°, revealed that fishing activities were predominantly concentrated in two regions: 121.1°E to 124°E, 35.7°N to 38.7°N, and 119.8°E to 122.8°E, 33.6°N to 35.4°N. This research provides a foundation for quantitative evaluations of fishery resources, offering vital data to promote the sustainable development of marine capture fisheries.
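The HHO-based hyperparameter search can be sketched with a compact version of the optimizer. Only the exploration step and a merged soft/hard besiege step are kept; the rapid-dive phases and the actual XGBoost cross-validation objective are omitted, so this is an illustrative skeleton rather than the paper's tuner:

```python
import numpy as np

def hho_minimize(f, bounds, n_hawks=10, n_iter=50, seed=0):
    """Minimize f over the box `bounds` with a simplified Harris Hawks loop."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    X = rng.uniform(lo, hi, (n_hawks, len(lo)))     # hawk positions
    best = min(X, key=f).copy()                     # elitist "rabbit"
    for t in range(n_iter):
        E = 2 * rng.uniform(-1, 1, n_hawks) * (1 - t / n_iter)  # escape energy decays
        for i in range(n_hawks):
            if abs(E[i]) >= 1:                       # exploration: jump near a random hawk
                Xr = X[rng.integers(n_hawks)]
                X[i] = Xr - rng.uniform() * np.abs(Xr - 2 * rng.uniform() * X[i])
            else:                                    # (soft/hard) besiege around the best
                J = 2 * (1 - rng.uniform())          # random jump strength
                X[i] = best - E[i] * np.abs(J * best - X[i])
        X = np.clip(X, lo, hi)
        cand = min(X, key=f)
        if f(cand) < f(best):
            best = cand.copy()
    return best, f(best)
```

In the paper's setting `f` would be a cross-validated loss of an XGBoost classifier over its hyperparameter box; here any objective over a box works.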
Applying bio-oxidation waste solution (BOS) to a chemical-biological two-stage oxidation process can significantly improve the bio-oxidation efficiency of arsenopyrite. This study aims to clarify the enhanced oxidation mechanism of arsenopyrite by evaluating the effects of physical and chemical changes of arsenopyrite in the BOS chemical oxidation stage on mineral dissolution kinetics, as well as on microbial growth activity and community structure composition in the bio-oxidation stage. The results showed that the chemical oxidation contributed to destroying the physical and chemical structure of the arsenopyrite surface and reducing the particle size, and led to the formation of nitrogenous substances on the mineral surface. These chemical oxidation behaviors effectively promoted Fe^(3+) cycling in the bio-oxidation system and weakened the inhibitory effect of the sulfur film on ionic diffusion, thereby enhancing the dissolution kinetics of the arsenopyrite. Therefore, the bio-oxidation efficiency of arsenopyrite was significantly increased in the two-stage oxidation process. After 18 d, the two-stage oxidation process achieved total extraction rates of (88.8±2.0)%, (86.7±1.3)%, and (74.7±3.0)% for As, Fe, and S, respectively. These values represent significant increases of (50.8±3.4)%, (47.1±2.7)%, and (46.0±0.7)%, respectively, compared to the one-stage bio-oxidation process.
By combining an improved model of the engraving process, a two-phase flow interior ballistic model has been proposed to accurately predict the flow and energy conversion behaviors of pyrotechnic actuators. Using computational fluid dynamics (CFD), the two-phase flow and piston engraving characteristics of a pyrotechnic actuator are investigated. Initially, the current model was utilized to examine the intricate, multi-dimensional flow and energy conversion characteristics of the propellant grains and combustion gas within the pyrotechnic actuator chamber. It was discovered that the constant transition of the combustion gas at the wall from potential to kinetic energy, along with the combined effect of the propellant motion, is what creates the pressure oscillation within the chamber. Additionally, a numerical analysis was conducted to determine the impact of various parameters on the pressure oscillation and piston motion, including the pyrotechnic charge, pyrotechnic particle size, and chamber structural dimensions. The findings show that decreasing the pyrotechnic charge lowers the terminal velocity, while both increasing and decreasing the pyrotechnic particle size reduce the pressure oscillation in the chamber; the pyrotechnic particle size has minimal bearing on the terminal velocity. The results of this investigation offer a trustworthy forecasting instrument for understanding and creating pyrotechnic actuator designs.
Heterogeneous federated learning (HtFL) has gained significant attention due to its ability to accommodate diverse models and data from distributed combat units. Prototype-based HtFL methods were proposed to reduce the high communication cost of transmitting model parameters. These methods allow sharing only class representatives between heterogeneous clients while maintaining privacy. However, existing prototype learning approaches fail to take the data distribution of clients into consideration, which results in suboptimal global prototype learning and insufficient client model personalization capability. To address these issues, we propose a fair trainable prototype federated learning (FedFTP) algorithm, which employs a fair sampling training prototype (FSTP) mechanism and a hyperbolic space constraints (HSC) mechanism to enhance the fairness and effectiveness of prototype learning on the server in heterogeneous environments. Furthermore, a local prototype stable update (LPSU) mechanism, based on contrastive learning, is proposed as a means of maintaining personalization while promoting global consistency. Comprehensive experimental results demonstrate that FedFTP achieves state-of-the-art performance in HtFL scenarios.
Recently, high-precision trajectory prediction of ballistic missiles in the boost phase has become a research hotspot. This paper proposes a trajectory prediction algorithm driven by data and knowledge (DKTP) to solve this problem. Firstly, the complex dynamics characteristics of a ballistic missile in the boost phase are analyzed in detail. Secondly, combining the missile dynamics model with the target gravity turning model, a knowledge-driven target three-dimensional turning (T3) model is derived. Then, a BP neural network is trained on a boost-phase trajectory database of typical scenarios to obtain a data-driven state parameter mapping (SPM) model. On this basis, an online trajectory prediction framework driven by data and knowledge is established. Based on the SPM model, the three-dimensional turning coefficients of the target are predicted from the current state of the target, and the state of the target at the next moment is obtained by combining the T3 model. Finally, simulation verification is carried out under various conditions. The simulation results show that the DKTP algorithm combines the advantages of data-driven and knowledge-driven approaches, improves the interpretability of the algorithm, and reduces uncertainty, achieving high-precision trajectory prediction of ballistic missiles in the boost phase.
The gears of new energy vehicles are required to withstand higher rotational speeds and greater loads, which imposes higher precision requirements on gear manufacturing. However, machining process parameters can cause changes in cutting force and heat, affecting gear machining precision. Therefore, this paper studies the effect of different process parameters on gear machining precision. A multi-objective optimization model is established for the relationship between the process parameters of the worm wheel gear grinding machine (cutting speed, feed rate, and cutting depth) and the tooth surface, tooth profile, and tooth lead deviations. The response surface method (RSM) is used for experimental design, and the corresponding experimental results and optimal process parameters are obtained. Subsequently, grey relational analysis-principal component analysis (GRA-PCA), particle swarm optimization (PSO), and genetic algorithm-particle swarm optimization (GA-PSO) methods are used to analyze the experimental results and obtain different optimal process parameters. The results show that the optimal process parameters obtained by the GRA-PCA, PSO, and GA-PSO methods all improve gear machining precision, and the precision obtained by GA-PSO is superior to that of the other methods.
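The grey relational analysis half of GRA-PCA can be sketched as follows: it scores each candidate parameter set against a reference (ideal) sequence. The PCA weighting step and the machine-specific responses are left out, so this shows only the GRA core:

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey relational grades of candidate sequences against a reference,
    after min-max normalization of each criterion. rho is the usual
    distinguishing coefficient (0.5 by convention)."""
    data = np.vstack([reference, candidates]).astype(float)
    span = data.max(0) - data.min(0)
    span[span == 0] = 1.0                            # guard constant criteria
    norm = (data - data.min(0)) / span               # normalize columns to [0, 1]
    ref, cand = norm[0], norm[1:]
    delta = np.abs(cand - ref)                       # deviation sequences
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coef.mean(axis=1)                         # grade = mean coefficient
```

A candidate identical to the reference gets the maximum grade of 1, and grades fall as the deviation sequences grow, which is what makes the grade usable as a single ranking score across multiple deviation criteria.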
This study presents a machine learning-based method for predicting the fragment velocity distribution in warhead fragmentation under explosive loading conditions. The fragment resultant velocities are correlated with key design parameters including casing dimensions and detonation positions. The paper details the finite element analysis of fragmentation, the characterization of the dynamic hardening and fracture models, the generation of comprehensive datasets, and the training of the ANN model. The results show the influence of casing dimensions on fragment velocity distributions, with the tendencies indicating increased resultant velocity with reduced thickness and increased length and diameter. The model's predictive capability is demonstrated through accurate predictions for both training and testing datasets, showing its potential for real-time prediction of fragmentation performance.
[Objective] Accurate prediction of tomato growth height is crucial for optimizing production environments in smart farming. However, current prediction methods predominantly rely on empirical, mechanistic, or learning-based models that utilize either image data or environmental data. These methods fail to fully leverage multi-modal data to capture the diverse aspects of plant growth comprehensively. [Methods] To address this limitation, a two-stage phenotypic feature extraction (PFE) model based on the deep learning algorithms of recurrent neural network (RNN) and long short-term memory (LSTM) was developed. The model integrated environmental and plant information to provide a holistic understanding of the growth process, employed phenotypic and temporal feature extractors to comprehensively capture both types of features, and enabled a deeper understanding of the interaction between tomato plants and their environment, ultimately leading to highly accurate predictions of growth height. [Results and Discussions] The experimental results showed the model's effectiveness: when predicting the next two days based on the past five days, the PFE-based RNN and LSTM models achieved mean absolute percentage errors (MAPE) of 0.81% and 0.40%, respectively, significantly lower than the 8.00% MAPE of the large language model (LLM) and the 6.72% MAPE of the Transformer-based model. In longer-term predictions, the 10-day prediction for 4 days ahead and the 30-day prediction for 12 days ahead, the PFE-RNN model continued to outperform the two baseline models, with MAPE of 2.66% and 14.05%, respectively. [Conclusions] The proposed method, which leverages phenotypic-temporal collaboration, shows great potential for intelligent, data-driven management of tomato cultivation, making it a promising approach for enhancing the efficiency and precision of smart tomato planting management.
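The MAPE figures quoted in the results follow the standard definition, which can be pinned down in a couple of lines:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent.
    Assumes y_true has no zero entries (heights are strictly positive)."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
```

For strictly positive targets like plant heights the denominator is safe, which is why MAPE is a natural metric in this setting.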
Foundation items: Project (2023YFC2909000) supported by the National Key R&D Program for Young Scientists, China; Project (2023JH3/10200010) supported by the Excellent Youth Natural Science Foundation of Liaoning Province, China; Project (XLYC2203167) supported by the Liaoning Revitalization Talents Program, China; Project (RC231175) supported by the Mid-career and Young Scientific and Technological Talents Program of Shenyang, China; Project (2023A03003-2) supported by the Key Special Program of Xinjiang, China; Project (N2301026) supported by the Fundamental Research Funds for the Central Universities, China.
Foundation items: Project (2022YFC2406000) supported by the National Key R&D Program, China; Project (2022GDASZH-2022010107) supported by the Guangdong Academy of Science, China; Project (2019BT02C629) supported by the Guangdong Special Support Program, China; Project (2022GDASZH-2022010203-003) supported by the GDAS' Project of Science and Technology Development, China; Projects (2023B1212120008, 2023B1212060045) supported by the Guangdong Province Science and Technology Plan Projects, China; Project (2023TQ07Z559) supported by the Special Support Foundation of Guangdong Province, China; Project (52105293) supported by the National Natural Science Foundation of China.
Abstract: In a non-homogeneous environment, traditional space-time adaptive processing does not effectively suppress interference and detect targets, because the secondary data do not exactly reflect the statistical characteristics of the range cell under test. A novel methodology utilizing the direct data domain approach to space-time adaptive processing (STAP) in airborne radar non-homogeneous environments is presented. The deterministic least-squares adaptive signal processing technique operates on a snapshot-by-snapshot basis to determine the adaptive weights for nulling interferences and estimating the signal of interest (SOI). Furthermore, this approach eliminates the requirement of estimating the covariance from the data of neighboring range cells, which avoids calculating the inverse of the covariance matrix, and can be implemented to operate in real time. Simulation results illustrate the efficiency of interference suppression in a non-homogeneous environment.
Funding: Project (2017YFC1405600) supported by the National Key R&D Program of China; Project (18JK05032) supported by the Scientific Research Project of the Education Department of Shaanxi Province, China.
Abstract: Due to the limited scenes that synthetic aperture radar (SAR) satellites can detect, the full-track utilization rate is not high. Because of the computing and storage limitations of a single satellite, it is difficult to process the large amounts of data produced by spaceborne synthetic aperture radars. A new method of networked satellite data processing is proposed for improving the efficiency of data processing. A multi-satellite distributed SAR real-time processing method based on the Chirp Scaling (CS) imaging algorithm is studied in this paper, and a distributed data processing system is built with field-programmable gate array (FPGA) chips as the kernel. Different from traditional CS algorithm processing, the system divides data processing into three stages. The computing tasks are reasonably allocated to different data processing units (i.e., satellites) in each stage. The method effectively saves the computing and storage resources of the satellites, improves the utilization rate of a single satellite, and shortens the data processing time. Gaofen-3 (GF-3) satellite SAR raw data were processed by the system, and the performance of the method was verified.
Funding: Projects (61363021, 61540061, 61663047) supported by the National Natural Science Foundation of China; Project (2017SE206) supported by the Open Foundation of the Key Laboratory in Software Engineering of Yunnan Province, China.
Abstract: Due to the increasing number of cloud applications, the amount of data in the cloud shows signs of growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most cloud storage systems currently adopt a hash-like approach to retrieving data that only supports simple keyword-based enquiries and lacks various forms of information search. Therefore, a scalable and efficient indexing scheme is clearly required. In this paper, we present a skip list-based cloud index, called SLC-index, which is a novel, scalable skip list-based indexing scheme for cloud data processing. The SLC-index offers a two-layered architecture for extending the indexing scope and facilitating better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, it is a flexible system due to its dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
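As a single-machine illustration of the data structure underlying the SLC-index, the sketch below implements a plain skip list with probabilistic node levels, point insertion, and a range query. The two-layer architecture, online node migration, and server membership management of the SLC-index itself are beyond this sketch; class and method names are illustrative.

```python
import random

class SkipListNode:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one forward pointer per level

class SkipList:
    """Plain in-memory skip list supporting insert and range queries."""
    MAX_LEVEL = 8

    def __init__(self):
        self.head = SkipListNode(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        # Each level is kept with probability 1/2, capped at MAX_LEVEL.
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = SkipListNode(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def range_query(self, lo, hi):
        # Descend to the first key >= lo, then walk level 0 up to hi.
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < lo:
                node = node.forward[i]
        node = node.forward[0]
        out = []
        while node and node.key <= hi:
            out.append(node.key)
            node = node.forward[0]
        return out
```

The expected O(log n) search path at every level is what makes skip lists attractive for both point and range queries, the two access patterns the SLC-index targets.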
Abstract: In this paper, an estimation method for reliability parameters in the case of zero-failure data, the synthetic estimation method, is given. For zero-failure data of the double-parameter exponential distribution, a hierarchical Bayesian estimation of the failure probability is presented. After failure information is introduced, the hierarchical Bayesian estimation and synthetic estimation of the failure probability, as well as the synthetic estimation of reliability, are given. Calculation and analysis are performed for a practical problem in which the life distribution of an engine obeys the double-parameter exponential distribution.
Funding: This work was supported by the Key Research and Development Program of Shaanxi (2022ZDLGY05-08); the Application Innovation Program of CASC (China Aerospace Science and Technology Corporation) (6230107001); the Research Project on Civil Aerospace Technology (D040304); the Research Project of CAST (Y23-WYHXJS-07); and the Research Foundation of the Key Laboratory of Spaceborne Information Intelligent Interpretation (2022-ZZKY-JJ-20-01).
Abstract: Beam-hopping technology has become one of the major research hotspots in satellite communication as a means of enhancing communication capacity and flexibility. However, beam hopping causes the traditional continuous time-division multiplexing signal in the forward downlink to become a burst signal, so satellite terminal receivers need to solve several key issues such as rapid burst-signal synchronization and high-performance reception. Firstly, this paper analyzes the key issues of burst communication for traffic signals in beam-hopping systems, and then compares typical carrier synchronization algorithms for burst signals. Secondly, combining the requirements of beam-hopping communication systems for efficient burst reception and low signal-to-noise ratio reception of downlink signals in forward links, a decoding-assisted bidirectional variable-parameter iterative carrier synchronization technique is proposed, which introduces the idea of iterative processing into carrier synchronization. Aiming at the technical characteristics of communication signal carrier synchronization, a new technical approach of bidirectional variable-parameter iteration is adopted, breaking through the traditional understanding that loop structures cannot adapt to low signal-to-noise ratio burst demodulation. Finally, combining the DVB-S2X standard physical-layer frame format used in high-throughput satellite communication systems, the research and performance simulation are conducted. The results show that the new technique proposed in this paper can significantly shorten the carrier synchronization time of burst signals, achieve fast synchronization of low signal-to-noise ratio burst signals, and offer the unique advantage of flexible and adjustable parameters.
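The bidirectional variable-parameter iterative synchronizer itself is not specified in enough detail here to reproduce; as a hedged baseline, the sketch below shows a standard feedforward frequency-offset estimator for burst signals, which averages the phase increments between consecutive complex baseband samples. The function name is illustrative, and this is the kind of conventional burst estimator such iterative techniques aim to improve upon.

```python
import cmath
import math

def estimate_freq_offset(samples):
    """Feedforward estimate of the carrier frequency offset, in
    cycles/sample, from the averaged phase increment between
    consecutive complex baseband samples."""
    # Each product s[n] * conj(s[n-1]) rotates by the per-sample phase step;
    # summing before taking the angle averages out additive noise.
    acc = sum(s * p.conjugate() for s, p in zip(samples[1:], samples[:-1]))
    return cmath.phase(acc) / (2 * math.pi)
```

Feedforward estimators like this are a common starting point for burst demodulation because they need no loop acquisition time, which is exactly the constraint beam-hopping bursts impose.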
Abstract: By analyzing the needs of seismic data processing and interpretation, a system model based on CSCW is designed. Using CSCW technology to build a cooperative work environment allows field data acquisition to possess the functions of remote real-time guidance by experts and remote real-time processing of the data. The model overcomes the influences and barriers existing in the areas
Funding: Supported by the Sharing and Diffusion of National R&D Outcome funded by the Korea Institute of Science and Technology Information.
Abstract: A distributed/parallel-processing system such as Sun Grid Engine (SGE), which utilizes multiple nodes/cores, is proposed for faster processing of large satellite image data. After verification, the distributed processing environment improved pre-processing performance by up to 560.65% over a single-processing system. Through this, analysis performance in various fields can be improved and, moreover, near-real-time service can be achieved in the near future.
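SGE distributes jobs across cluster nodes; the sketch below illustrates the same split-map-gather pattern on a single machine with a thread pool. `preprocess_tile` is a hypothetical stand-in for a real per-tile step (e.g. radiometric correction), not part of SGE or the system described above.

```python
from concurrent.futures import ThreadPoolExecutor

def preprocess_tile(tile):
    # Stand-in for a per-tile pre-processing step on pixel values.
    return [2 * px for px in tile]

def parallel_preprocess(tiles, workers=4):
    """Split the image into tiles, process them concurrently, and
    gather the results in the original tile order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(preprocess_tile, tiles))
```

On a real cluster, each tile would be submitted as a separate SGE job rather than a thread, but the decomposition of the image into independently processable tiles is the same.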
Funding: Supported by the Technology Research and Development Project of Heilongjiang Province (GB14D202).
Abstract: This paper selected the corn processing industry technology innovation alliance in Heilongjiang Province as the research object and evaluated the operational performance of the alliance using the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation methods. The AHP empirical results showed that satisfaction with information communication and satisfaction with the management process were the weakest. Ranked from high to low, the indicators affecting the alliance were the results of alliance operations, the process of alliance operations, and the behavioral attitude of alliance members. In addition, the results of the fuzzy comprehensive evaluation showed that the operational performance of the corn processing industry technology innovation alliance in Heilongjiang Province was at a general level.
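The AHP step above derives indicator weights from a pairwise-comparison matrix; as a minimal sketch, the helper below uses the normalized column-sum (arithmetic mean) approximation to the principal eigenvector. A full AHP study would also compute the consistency ratio, which this sketch omits; the matrix in the test is a made-up example, not the paper's data.

```python
def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise-comparison
    matrix using the normalized column-sum (arithmetic mean) method."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    # Normalize each column to sum to 1, then average across each row.
    norm = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(norm[i]) / n for i in range(n)]
```

For a perfectly consistent matrix this reproduces the exact eigenvector weights; for mildly inconsistent expert judgments it is a standard cheap approximation.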
Abstract: Estimating trawler fishing effort plays a critical role in characterizing marine fisheries activities, quantifying the ecological impact of trawling, and refining regulatory frameworks and policies. Understanding trawler fishing inputs offers crucial scientific data to support the sustainable management of offshore fishery resources in China. An XGBoost algorithm was introduced and optimized through Harris Hawks Optimization (HHO) to develop a model for identifying trawler fishing behaviour. The model demonstrated exceptional performance, achieving accuracy, sensitivity, specificity, and Matthews correlation coefficient values of 0.9713, 0.9806, 0.9632, and 0.9425, respectively. Using this model to detect fishing activities, the fishing effort of trawlers from Shandong Province in the sea area between 119°E to 124°E and 32°N to 40°N in 2021 was quantified. A heatmap depicting fishing effort, generated at a spatial resolution of 1/8°, revealed that fishing activities were predominantly concentrated in two regions: 121.1°E to 124°E, 35.7°N to 38.7°N, and 119.8°E to 122.8°E, 33.6°N to 35.4°N. This research provides a foundation for quantitative evaluations of fishery resources, offering vital data to promote the sustainable development of marine capture fisheries.
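The four reported metrics can all be derived from the binary confusion matrix of fishing versus non-fishing predictions. The helper below shows the standard definitions; it is illustrative only, and the counts in the test are invented, not the study's data.

```python
import math

def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, specificity, and Matthews correlation
    coefficient from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return accuracy, sensitivity, specificity, mcc
```

MCC is the most demanding of the four because it penalizes both kinds of error symmetrically, which is why a value of 0.94 alongside 0.97 accuracy indicates a genuinely balanced classifier.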
Funding: Project (52274348) supported by the National Natural Science Foundation of China; Project (2022JH1/10400024) supported by the Major Projects for the “Revealed Top” Science and Technology of Liaoning Province, China.
Abstract: Applying bio-oxidation waste solution (BOS) to a chemical-biological two-stage oxidation process can significantly improve the bio-oxidation efficiency of arsenopyrite. This study aims to clarify the enhanced oxidation mechanism of arsenopyrite by evaluating the effects of the physical and chemical changes of arsenopyrite in the BOS chemical oxidation stage on mineral dissolution kinetics, as well as on microbial growth activity and community structure composition in the bio-oxidation stage. The results showed that the chemical oxidation contributed to destroying the physical and chemical structure of the arsenopyrite surface and reducing the particle size, and led to the formation of nitrogenous substances on the mineral surface. These chemical oxidation behaviors effectively promoted Fe^(3+) cycling in the bio-oxidation system and weakened the inhibitory effect of the sulfur film on ionic diffusion, thereby enhancing the dissolution kinetics of the arsenopyrite. Therefore, the bio-oxidation efficiency of arsenopyrite was significantly increased in the two-stage oxidation process. After 18 days, the two-stage oxidation process achieved total extraction rates of (88.8±2.0)%, (86.7±1.3)%, and (74.7±3.0)% for As, Fe, and S, respectively. These values represent significant increases of (50.8±3.4)%, (47.1±2.7)%, and (46.0±0.7)%, respectively, compared to the one-stage bio-oxidation process.
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 11972194).
Abstract: By combining an improved model of the engraving process, a two-phase flow interior ballistic model is proposed to accurately predict the flow and energy conversion behaviors of pyrotechnic actuators. Using computational fluid dynamics (CFD), the two-phase flow and piston engraving characteristics of a pyrotechnic actuator are investigated. Initially, the current model was utilized to examine the intricate, multi-dimensional flow and energy conversion characteristics of the propellant grains and combustion gas within the pyrotechnic actuator chamber. It was found that the continual conversion of the combustion gas at the wall from potential to kinetic energy, together with the combined effect of the propellant motion, creates the pressure oscillation within the chamber. Additionally, a numerical analysis was conducted to determine the impact of various parameters on the pressure oscillation and piston motion, including the pyrotechnic charge, pyrotechnic particle size, and chamber structural dimensions. The findings show that decreasing the pyrotechnic charge will lower the terminal velocity, while either increasing or decreasing the pyrotechnic particle size will reduce the pressure oscillation in the chamber. The pyrotechnic particle size has minimal bearing on the terminal velocity. The results of this investigation offer a reliable predictive tool for understanding and designing pyrotechnic actuators.
Funding: Supported by the Natural Science Foundation of Xinjiang Uygur Autonomous Region (No. 2022D01B187).
Abstract: Heterogeneous federated learning (HtFL) has gained significant attention due to its ability to accommodate diverse models and data from distributed combat units. Prototype-based HtFL methods have been proposed to reduce the high communication cost of transmitting model parameters. These methods allow the sharing of only class representatives between heterogeneous clients while maintaining privacy. However, existing prototype learning approaches fail to take the data distribution of clients into consideration, which results in suboptimal global prototype learning and insufficient client model personalization capabilities. To address these issues, we propose a fair trainable prototype federated learning (FedFTP) algorithm, which employs a fair sampling training prototype (FSTP) mechanism and a hyperbolic space constraints (HSC) mechanism to enhance the fairness and effectiveness of prototype learning on the server in heterogeneous environments. Furthermore, a local prototype stable update (LPSU) mechanism is proposed as a means of maintaining personalization while promoting global consistency, based on contrastive learning. Comprehensive experimental results demonstrate that FedFTP achieves state-of-the-art performance in HtFL scenarios.
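Prototype-based HtFL shares only per-class feature representatives rather than model weights. As a minimal baseline sketch of that idea, the function below performs server-side aggregation of client prototypes by plain averaging; FedFTP's fair-sampling, hyperbolic-space, and contrastive mechanisms are deliberately not reproduced here, and the names are illustrative.

```python
def aggregate_prototypes(client_prototypes):
    """Average per-class prototypes uploaded by heterogeneous clients.

    client_prototypes: list of {class_label: feature_vector} dicts,
    one dict per client; clients may cover different class subsets."""
    sums, counts = {}, {}
    for protos in client_prototypes:
        for label, vec in protos.items():
            if label not in sums:
                sums[label] = [0.0] * len(vec)
                counts[label] = 0
            sums[label] = [a + b for a, b in zip(sums[label], vec)]
            counts[label] += 1
    # Global prototype per class = mean over the clients that hold it.
    return {label: [v / counts[label] for v in sums[label]] for label in sums}
```

Because only these small per-class vectors cross the network, the communication cost is independent of each client's model architecture, which is what makes the approach attractive for heterogeneous clients.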
Funding: The National Natural Science Foundation of China (Grants No. 12072090 and No. 12302056) provided funding for conducting the experiments.
Abstract: Recently, high-precision trajectory prediction of ballistic missiles in the boost phase has become a research hotspot. This paper proposes a trajectory prediction algorithm driven by data and knowledge (DKTP) to solve this problem. Firstly, the complex dynamic characteristics of a ballistic missile in the boost phase are analyzed in detail. Secondly, combining the missile dynamics model with the target gravity turning model, a knowledge-driven target three-dimensional turning (T3) model is derived. Then, a BP neural network is used to train on a boost-phase trajectory database of typical scenarios to obtain a data-driven state parameter mapping (SPM) model. On this basis, an online trajectory prediction framework driven by data and knowledge is established. Based on the SPM model, the three-dimensional turning coefficients of the target are predicted using the current state of the target, and the state of the target at the next moment is obtained by combining the T3 model. Finally, simulation verification is carried out under various conditions. The simulation results show that the DKTP algorithm combines the advantages of data-driven and knowledge-driven approaches, improves the interpretability of the algorithm, and reduces uncertainty, achieving high-precision trajectory prediction of ballistic missiles in the boost phase.
Funding: Projects (U22B2084, 52275483, 52075142) supported by the National Natural Science Foundation of China; Project (2023ZY01050) supported by the Ministry of Industry and Information Technology High Quality Development, China.
Abstract: The gears of new energy vehicles are required to withstand higher rotational speeds and greater loads, which imposes higher precision requirements on gear manufacturing. However, machining process parameters can cause changes in cutting force and heat, affecting gear machining precision. Therefore, this paper studies the effect of different process parameters on gear machining precision. A multi-objective optimization model is established for the relationship between process parameters and tooth surface deviations, tooth profile deviations, and tooth lead deviations through the cutting speed, feed rate, and cutting depth of the worm wheel gear grinding machine. The response surface method (RSM) is used for experimental design, and the corresponding experimental results and optimal process parameters are obtained. Subsequently, gray relational analysis-principal component analysis (GRA-PCA), particle swarm optimization (PSO), and genetic algorithm-particle swarm optimization (GA-PSO) methods are used to analyze the experimental results and obtain different optimal process parameters. The results show that the optimal process parameters obtained by the GRA-PCA, PSO, and GA-PSO methods improve gear machining precision. Moreover, the gear machining precision obtained by GA-PSO is superior to that of the other methods.
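To make the PSO step concrete, the sketch below is a minimal particle swarm optimizer for a box-constrained objective. The inertia and acceleration constants are common textbook values, the toy sphere objective in the test stands in for the machining-deviation objective, and none of this reproduces the paper's GA-PSO hybrid; it is an illustrative baseline only.

```python
import random

def pso_minimize(f, bounds, particles=20, iters=100, seed=0):
    """Minimal particle swarm optimizer for a scalar function on a box.
    bounds: list of (lo, hi) per dimension. Returns (best_pos, best_val)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, social coefficients
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp the updated position back into the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

A GA-PSO hybrid of the kind used above typically interleaves genetic crossover/mutation with these velocity updates to escape local optima; the core velocity-and-personal/global-best loop is unchanged.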
Funding: Supported by the Poongsan-KAIST Future Research Center Project and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (Grant No. 2023R1A2C2005661).
Abstract: This study presents a machine learning-based method for predicting the fragment velocity distribution in warhead fragmentation under explosive loading conditions. The fragment resultant velocities are correlated with key design parameters, including the casing dimensions and detonation positions. The paper details the finite element analysis of fragmentation, the characterization of the dynamic hardening and fracture models, the generation of comprehensive datasets, and the training of the ANN model. The results show the influence of casing dimensions on fragment velocity distributions, with the trends indicating increased resultant velocity with reduced thickness and increased length and diameter. The model's predictive capability is demonstrated through accurate predictions for both the training and testing datasets, showing its potential for real-time prediction of fragmentation performance.
Abstract: [Objective] Accurate prediction of tomato growth height is crucial for optimizing production environments in smart farming. However, current prediction methods predominantly rely on empirical, mechanistic, or learning-based models that utilize either image data or environmental data. These methods fail to fully leverage multi-modal data to capture the diverse aspects of plant growth comprehensively. [Methods] To address this limitation, a two-stage phenotypic feature extraction (PFE) model based on the deep learning algorithms of recurrent neural networks (RNN) and long short-term memory (LSTM) was developed. The model integrated environmental and plant information to provide a holistic understanding of the growth process, employed phenotypic and temporal feature extractors to comprehensively capture both types of features, and enabled a deeper understanding of the interaction between tomato plants and their environment, ultimately leading to highly accurate predictions of growth height. [Results and Discussions] The experimental results showed the model's effectiveness: when predicting the next two days based on the past five days, the PFE-based RNN and LSTM models achieved mean absolute percentage errors (MAPE) of 0.81% and 0.40%, respectively, which were significantly lower than the 8.00% MAPE of the large language model (LLM) and the 6.72% MAPE of the Transformer-based model. In longer-term predictions, the 10-day prediction for 4 days ahead and the 30-day prediction for 12 days ahead, the PFE-RNN model continued to outperform the two baseline models, with MAPEs of 2.66% and 14.05%, respectively. [Conclusions] The proposed method, which leverages phenotypic-temporal collaboration, shows great potential for intelligent, data-driven management of tomato cultivation, making it a promising approach for enhancing the efficiency and precision of smart tomato planting management.
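The MAPE figures quoted above follow the standard definition of mean absolute percentage error; for clarity, the helper below computes it. This is an illustrative implementation assuming no zero actual values, not code from the study.

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent.
    Assumes every actual value is nonzero (plant heights are positive)."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)
```

Because the error is normalized per observation, a 0.40% MAPE means the predicted heights deviate from the measured heights by 0.4% on average, regardless of the absolute scale of the plants.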