Abstract: A novel method for noise removal from the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters from the signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using Chebyshev best uniform approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. Hardware-in-the-loop simulation and prototype experiments show that, compared with other commonly used methods, the proposed processing method improves the measurement variance of gravity gradient signals by 14% and achieves a measurement accuracy within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insight for high-precision airborne gravity gradiometry.
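The core idea the abstract names, zero-phase band-pass filtering with the data mirrored head-to-tail so filter transients fall in the padding rather than the measurement, can be sketched as follows. This is only an illustration of the principle, not the paper's implementation: the sample rate, pass band, extension length, and synthetic test signal are all made-up values.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000.0                       # sample rate (Hz), illustrative only
t = np.arange(0, 2.0, 1 / fs)
# synthetic "gradient" tone at 50 Hz buried in broadband noise
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)

# band-pass filter around the band of interest
sos = butter(4, [40, 60], btype="bandpass", fs=fs, output="sos")

# head-to-tail expansion: mirror n_ext samples at each end so the
# filter's edge transient lands in the padding, not in the data
n_ext = 200
xe = np.concatenate([x[n_ext:0:-1], x, x[-2:-n_ext - 2:-1]])

# forward-backward (zero-phase) filtering, then trim the padding
ye = sosfiltfilt(sos, xe, padtype=None)
y = ye[n_ext:-n_ext]

assert y.shape == x.shape
```

Filtering forward and then backward cancels the filter's phase distortion, which matters for gradiometry because a phase shift would displace the recovered gradient signal in time.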
Abstract: Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, traditional probability theory cannot easily be used to process the samples and assess their degree of uncertainty. Using grey relational theory and norm theory, this article proposes the grey distance information approach, based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples. The definitions of these two quantities, together with their properties and algorithms, are introduced. Related problems, including the algorithm for the estimated value, the standard deviation, and the acceptance and rejection criteria for samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weighting algorithm and to compare different samples. Several examples demonstrate the application of the proposed approach and show that it is feasible and effective, with no requirement on the probability distribution of the small samples.
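The abstract does not give the paper's definitions of the grey distance information quantities, but the grey relational theory it builds on has a standard building block, Deng's grey relational coefficient, which can be sketched as follows. The function name and the toy data are illustrative, not from the paper.

```python
import numpy as np

def grey_relational_grade(ref, seqs, rho=0.5):
    """Deng's grey relational grade of each sequence in `seqs` against
    the reference sequence `ref` (all 1-D, same length); rho is the
    conventional distinguishing coefficient."""
    ref = np.asarray(ref, dtype=float)
    seqs = np.asarray(seqs, dtype=float)
    delta = np.abs(seqs - ref)              # pointwise grey distances
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)               # average over each sequence

# toy small-sample example: which candidate tracks the reference best?
ref = [2.0, 3.0, 4.0, 5.0]
grades = grey_relational_grade(ref, [[2.1, 3.1, 3.9, 5.2],
                                     [1.0, 2.0, 6.0, 7.0]])
assert grades[0] > grades[1]
```

A larger grade means a sequence stays closer to the reference pointwise, which is the kind of distribution-free closeness measure that makes grey methods attractive for small samples.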
Funding: Project (2017YFC1405600) supported by the National Key R&D Program of China; Project (18JK05032) supported by the Scientific Research Project of the Education Department of Shaanxi Province, China.
Abstract: Due to the limited scenes that synthetic aperture radar (SAR) satellites can detect, the full-track utilization rate is not high. Because of the computing and storage limitations of a single satellite, it is difficult to process the large amounts of data produced by spaceborne SARs. A new method of networked satellite data processing is proposed to improve the efficiency of data processing. A multi-satellite distributed SAR real-time processing method based on the Chirp Scaling (CS) imaging algorithm is studied in this paper, and a distributed data processing system is built with field programmable gate array (FPGA) chips as the kernel. Unlike traditional CS processing, the system divides data processing into three stages, and in each stage the computing tasks are reasonably allocated to different data processing units (i.e., satellites). The method effectively saves the computing and storage resources of the satellites, improves the utilization rate of a single satellite, and shortens the data processing time. Gaofen-3 (GF-3) satellite SAR raw data is processed by the system, verifying the performance of the method.
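The three-stage partitioning described above can be sketched as a data-flow skeleton. To keep the sketch self-checking, the stage functions below are placeholder transform pairs that are mutual inverses (so the output reproduces the input); the real CS kernels apply chirp-scaling, range-compression/RCMC, and azimuth-compression phase functions, which the abstract does not detail.

```python
import numpy as np

# Placeholder stand-ins for the three Chirp Scaling stages; names and
# maths are illustrative only, chosen so the whole chain is an identity
# and the plumbing can be verified.
def stage1_chirp_scaling(block):
    return np.fft.fft(block, axis=0)        # azimuth FFT (+ scaling phase)

def stage2_range_processing(block):
    return np.fft.ifft(np.fft.fft(block, axis=1), axis=1)  # range ops

def stage3_azimuth_compression(block):
    return np.fft.ifft(block, axis=0)       # azimuth IFFT

def distributed_process(raw, n_units=4):
    """Split raw echo data into blocks and run each stage on every
    block; in orbit each stage/block pair would be assigned to a
    different satellite, here the calls are simply chained."""
    blocks = np.array_split(raw, n_units, axis=0)
    blocks = [stage1_chirp_scaling(b) for b in blocks]
    blocks = [stage2_range_processing(b) for b in blocks]
    blocks = [stage3_azimuth_compression(b) for b in blocks]
    return np.vstack(blocks)

raw = np.random.default_rng(1).standard_normal((64, 32))
img = distributed_process(raw)
# with inverse-pair placeholder stages, the chain reconstructs the input
assert np.allclose(img, raw)
```

The point of the staging is that each unit only ever holds one block and one stage's working set, which is what lets a constellation share the memory and compute load of a single large scene.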
Funding: Projects (61363021, 61540061, 61663047) supported by the National Natural Science Foundation of China; Project (2017SE206) supported by the Open Foundation of the Key Laboratory in Software Engineering of Yunnan Province, China.
Abstract: Due to the increasing number of cloud applications, the amount of data in the cloud is growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most current cloud storage systems adopt a hash-like approach to data retrieval that supports only simple keyword-based queries and lacks richer forms of information search. A scalable and efficient indexing scheme is therefore clearly required. In this paper, we present the SLC-index, a novel, scalable skip list-based index for cloud data processing. The SLC-index offers a two-layered architecture for extending the indexing scope and achieving better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, the system is flexible, supporting dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
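The data structure underneath the SLC-index is the classic skip list, which supports both the point and range queries the abstract emphasizes. A minimal single-machine sketch (the SLC-index's two-layer routing and server migration are not shown, and the class names here are illustrative):

```python
import random

class SkipNode:
    __slots__ = ("key", "value", "next")
    def __init__(self, key, value, level):
        self.key, self.value = key, value
        self.next = [None] * level          # forward pointers per level

class SkipList:
    """Minimal skip list with insert, point query, and range query."""
    MAX_LEVEL = 16

    def __init__(self):
        self.head = SkipNode(None, None, self.MAX_LEVEL)
        self.level = 1

    def _random_level(self):
        lvl = 1
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1                        # geometric level distribution
        return lvl

    def insert(self, key, value):
        update = [self.head] * self.MAX_LEVEL
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.next[i] and node.next[i].key < key:
                node = node.next[i]
            update[i] = node                # last node before key, per level
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = SkipNode(key, value, lvl)
        for i in range(lvl):
            new.next[i] = update[i].next[i]
            update[i].next[i] = new

    def search(self, key):                  # point query, expected O(log n)
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.next[i] and node.next[i].key < key:
                node = node.next[i]
        node = node.next[0]
        return node.value if node and node.key == key else None

    def range(self, lo, hi):                # range query over [lo, hi]
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.next[i] and node.next[i].key < lo:
                node = node.next[i]
        node, out = node.next[0], []
        while node and node.key <= hi:      # bottom level is sorted
            out.append((node.key, node.value))
            node = node.next[0]
        return out

sl = SkipList()
for k in [30, 10, 50, 20, 40]:
    sl.insert(k, f"v{k}")
assert sl.search(20) == "v20"
assert [k for k, _ in sl.range(15, 45)] == [20, 30, 40]
```

Because the bottom level is a sorted linked list, a range query is a point lookup for the lower bound followed by a sequential scan, which is exactly why skip lists suit range search better than the hash-like schemes the abstract criticizes.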
Funding: Supported by the Project of the Guangdong Science and Technology Department (2020B010166005); the Post-Doctoral Research Project (Z000158); the Ministry of Education Social Science Fund (22YJ630167); the Fund Project of the Department of Science and Technology of Guangdong Province (GDKTP2021032500); and the Guangdong Philosophy and Social Science Fund (GD22YYJ15).
Abstract: Inter-agency government information sharing (IAGIS) plays an important role in improving the service and efficiency of government agencies. Currently, there is still no effective and secure way for data-driven IAGIS to fulfill the dynamic demands of information sharing between government agencies. Motivated by blockchain and data mining, a data-driven framework for IAGIS is proposed in this paper. Firstly, a blockchain is used as the core of the framework to monitor and prevent the leakage and abuse of government information, in order to guarantee information security. Secondly, a four-layer architecture is designed to implement the proposed framework. Thirdly, the classical data mining algorithms PageRank and Apriori are applied to dynamically design smart contracts for information sharing, for the purpose of flexibly adjusting information sharing strategies according to the practical demands of government agencies for public management and public service. Finally, a case study illustrates the operation of the proposed framework.
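Of the two mining algorithms named above, Apriori is the one that discovers which record types are frequently requested together, a natural input for sharing contracts. A minimal level-wise sketch (the sharing-log data and the 0.6 support threshold are invented for illustration):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent itemsets via the Apriori principle: every subset of a
    frequent itemset must itself be frequent."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)
    def support(itemset):
        return sum(itemset <= t for t in transactions) / n
    # level 1: frequent single items
    items = sorted({i for t in transactions for i in t})
    current = [frozenset([i]) for i in items
               if support(frozenset([i])) >= min_support]
    frequent = {s: support(s) for s in current}
    k = 2
    while current:
        # candidate generation: size-k unions of frequent (k-1)-sets,
        # pruned unless every (k-1)-subset is already frequent
        candidates = {a | b for a in current for b in current
                      if len(a | b) == k}
        candidates = [c for c in candidates
                      if all(frozenset(s) in frequent
                             for s in combinations(c, k - 1))]
        current = [c for c in candidates if support(c) >= min_support]
        frequent.update({c: support(c) for c in current})
        k += 1
    return frequent

# toy sharing log: which record types are requested together?
logs = [{"tax", "census"}, {"tax", "census", "land"},
        {"tax", "land"}, {"census", "land"}, {"tax", "census"}]
freq = apriori(logs, min_support=0.6)
assert frozenset({"tax", "census"}) in freq
```

A smart contract could then be (re)generated so that record types with high co-occurrence support are bundled into a single sharing agreement, matching the "dynamically design smart contracts" step in the abstract.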
Funding: Supported by the Fundamental Research Funds for the Central Universities (WK2310000102).
Abstract: Hefei Light Source (HLS) is a synchrotron radiation light source that primarily produces vacuum ultraviolet and soft X-rays. It currently comprises ten experimental stations, including a soft X-ray microscopy station. As part of its ongoing efforts to establish a centralized scientific data management platform, HLS is developing a test system that covers the entire lifecycle of scientific data, including data generation, acquisition, processing, analysis, and destruction. However, the instruments used at the soft X-ray microscopy experimental station rely on commercial proprietary software for data acquisition and processing. We developed a semi-automatic data acquisition program to facilitate the integration of the soft X-ray microscopy station into the centralized scientific data management platform. Additionally, we created an online data processing platform to assist users in analyzing their scientific data. The system we developed and deployed meets the design requirements, successfully integrating the soft X-ray microscopy station into the full lifecycle management of scientific data.
Abstract: Sensor networks provide a means to link people with the real world by processing data collected from the real world in real time and routing the query results to the right people. Application examples include continuous monitoring of the environment, building infrastructure, and human health. Many researchers view sensor networks as databases, with the monitoring tasks performed as subscriptions, queries, and alerts. However, this view is not precise. First, databases can only deal with well-formed data types with a well-defined schema for their interpretation, while the raw data collected by sensor networks in most cases do not meet this requirement. Second, sensor networks have to deal with highly dynamic targets, environments, and resources, while databases are more static. To fill this gap between sensor networks and databases, we propose a novel approach, referred to as 'spatiotemporal data stream segmentation', or 'stream segmentation' for short, to address the dynamic nature of sensor networks and deal with their 'raw' data. Stream segmentation is defined using Bayesian networks in the context of sensor networks, and two application examples demonstrate the usefulness of the approach.
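The stream-to-segments interface the abstract describes can be illustrated with a deliberately simple rule: cut the stream whenever a sample deviates sharply from the current segment's running mean. The paper formulates segmentation with Bayesian networks; this threshold rule and its parameters are only a stand-in to show the shape of the problem.

```python
def segment_stream(stream, threshold=2.0, min_len=3):
    """Cut a scalar sensor stream into homogeneous segments: start a
    new segment when a sample deviates from the current segment's
    running mean by more than `threshold` (illustrative rule, not the
    paper's Bayesian-network formulation)."""
    segments, current = [], []
    for x in stream:
        if current and len(current) >= min_len:
            mean = sum(current) / len(current)
            if abs(x - mean) > threshold:   # regime change detected
                segments.append(current)
                current = []
        current.append(x)
    if current:
        segments.append(current)
    return segments

# toy temperature stream with one level shift
readings = [20.1, 20.3, 19.9, 20.0, 25.2, 25.0, 25.4, 25.1]
segs = segment_stream(readings)
assert len(segs) == 2 and segs[1][0] == 25.2
```

Each segment, rather than each raw sample, can then be typed and given a schema, which is precisely the bridge to the database view that the raw stream lacks.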
Funding: Supported in part by the National High-Tech Research and Development Program of China (863 Program) (2007AA120302) and the National Basic Research Program of China (973 Program) (2009CB724003).
Abstract: Calibration is a processing procedure for across-track interferometric synthetic aperture radar (InSAR) to achieve accurate three-dimensional location. A calibration technique, called weighted joint calibration, is proposed for the generation of wide-area geocoded digital elevation models (DEMs). It calibrates multiple InSAR scenes simultaneously and reduces the number of required ground control points (GCPs) by using tie points (TPs). This approach ensures the continuity of three-dimensional location among adjacent scenes, which is necessary for the mosaicking and fusion of data coming from different scenes. In addition, it introduces weights into the calibration to discriminate between GCPs and TPs with different coherences and locations. This paper presents the principles and methodology of this weighted joint calibration technique and illustrates its successful application to airborne InSAR data.
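The weighting idea above reduces, at the estimation step, to a weighted least-squares solve in which GCP constraints carry more weight than (noisier) TP constraints. A hedged sketch under invented dimensions and weights; the actual design matrix in InSAR calibration comes from sensitivity equations not shown here:

```python
import numpy as np

def weighted_joint_solve(A, b, weights):
    """Weighted least-squares update of calibration parameters:
    minimize ||W^(1/2) (A p - b)||, where rows of A and b come from
    GCP and TP height constraints and `weights` encode coherence and
    point type (GCPs typically weighted above TPs)."""
    W = np.sqrt(np.asarray(weights, dtype=float))
    p, *_ = np.linalg.lstsq(A * W[:, None], b * W, rcond=None)
    return p

# toy setup: 2 calibration parameters, 5 constraints (3 GCPs, 2 TPs)
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))
p_true = np.array([0.5, -1.2])
b = A @ p_true
b[3:] += 0.3 * rng.standard_normal(2)   # TP rows carry extra noise
w = [1.0, 1.0, 1.0, 0.2, 0.2]           # down-weight the TP rows
p_hat = weighted_joint_solve(A, b, w)
assert np.allclose(p_hat, p_true, atol=0.2)
```

Solving all scenes' constraints in one system, instead of scene by scene, is what enforces the cross-scene height continuity the abstract calls out as necessary for mosaicking.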
Funding: This project was supported by Fubangs Science & Technology Company Ltd.
Abstract: In this paper, the credit-risk decision mechanism for banks is studied for the case where the loan interest rate is fixed and information in the credit market is asymmetric. We present designs of the credit-risk decision mechanism with and without credit rationing, for the case where the collateral value provided by an entrepreneur is not less than the minimum demanded by the bank. It is shown that under this mechanism, banks can efficiently identify the risk level of a project. Finally, the condition for the bank's investigation of the project is given.
Abstract: The computer control techniques applicable to electronically scanned multifunction radars are presented. The software and hardware architecture for real-time control and data processing within a phased array radar is described. The software system, comprising a number of tasks, is written in the C language and implemented. The results show that the algorithms for multitask adaptive scheduling and multitarget data processing are suitable for multifunction phased array radars.
Abstract: This paper takes the Sobel operator as an example to study the mapping of a sequential algorithm onto a shared-memory multiprocessor by using a virtual machine. Several parallel algorithms using function decomposition and/or data decomposition are compared, and their performance is analyzed in terms of processor utilization, data traffic, shared-memory access, and synchronization overhead. The analysis is validated through a simulation experiment on a virtual machine with 64 parallel processors. Conclusions are presented at the end of the paper.
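The data-decomposition strategy compared in the paper can be sketched concretely: split the image into row strips with a one-row halo on each side, so that each worker's valid Sobel output tiles the full result with no seams. The strip count and test image are illustrative; the strips are mapped sequentially here, where the paper assigns them to separate processors.

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])  # Sobel x-kernel
KY = KX.T                                            # Sobel y-kernel

def sobel(block):
    """Sobel gradient magnitude of a 2-D block (valid region only)."""
    h, w = block.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            win = block[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(KX * win)
            gy[i, j] = np.sum(KY * win)
    return np.hypot(gx, gy)

def sobel_data_decomposed(img, n_workers=4):
    """Data decomposition: overlapping row strips (1-row halo at each
    edge) whose valid outputs stack into the full-image result."""
    h = img.shape[0]
    bounds = np.linspace(0, h - 2, n_workers + 1, dtype=int)
    strips = [img[lo:hi + 2] for lo, hi in zip(bounds[:-1], bounds[1:])]
    # each strip could go to one processor; here we map sequentially
    return np.vstack([sobel(s) for s in strips])

img = np.random.default_rng(2).standard_normal((32, 16))
assert np.allclose(sobel_data_decomposed(img), sobel(img))
```

The halo rows are the source of the extra data traffic and shared-memory accesses that the paper's analysis accounts for: each boundary row is read by two processors instead of one.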