A novel method for noise removal for the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using the Chebyshev optimal consistent approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. The results of hardware-in-the-loop simulation and prototype experiments show that, compared with other commonly used methods, the proposed processing method yields a 14% improvement in the measurement variance of gravity gradient signals and a measurement accuracy within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insights for high-precision airborne gravity gradiometry.
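The zero-phase filtering idea with head-to-tail data expansion can be sketched as follows. This is a minimal illustration using a moving-average FIR filter and a plain mirror extension; the paper's actual band-pass design and expansion scheme are not specified here, so the filter choice and padding length are assumptions.

```python
def moving_average(x, w):
    """Simple causal FIR low-pass: w-point moving average."""
    out = []
    for i in range(len(x)):
        lo = max(0, i - w + 1)
        seg = x[lo:i + 1]
        out.append(sum(seg) / len(seg))
    return out

def zero_phase_filter(x, w=5, pad=10):
    """Zero-phase filtering with head-to-tail (mirror) extension.

    The signal is extended at both ends by mirrored copies so the
    forward and backward passes settle before reaching the real data,
    then filtered forward and backward to cancel the phase delay.
    """
    # head-to-tail mirror extension (reflect about the end points)
    head = x[1:pad + 1][::-1]
    tail = x[-pad - 1:-1][::-1]
    ext = head + x + tail
    # forward pass
    y = moving_average(ext, w)
    # backward pass cancels the phase shift of the forward pass
    y = moving_average(y[::-1], w)[::-1]
    # drop the padding, restoring the original length
    return y[pad:pad + len(x)]
```

A library routine such as a forward-backward IIR filter would normally be used in practice; the sketch only shows why the end-extension suppresses edge transients.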
Low-field nuclear magnetic resonance (NMR) has been widely used in the petroleum industry, for example in well logging and laboratory rock core analysis. However, the signal-to-noise ratio is low owing to the low magnetic field strength of NMR tools and the complex petrophysical properties of the detected samples. Suppressing the noise and highlighting the usable NMR signals is therefore very important for subsequent data processing. Most denoising methods are based on fixed mathematical transformations or hand-designed feature selectors to suppress noise characteristics, which may not perform well because they do not adapt to different noisy signals. In this paper, we propose a data processing framework based on dictionary learning to improve the quality of low-field NMR echo data. Dictionary learning is a machine learning method based on redundancy and sparse representation theory; usable information in noisy NMR echo data can be adaptively extracted and reconstructed by it. The advantages and effectiveness of the proposed method were verified with a number of numerical simulations, NMR core data analyses, and NMR logging data processing. The results show that dictionary learning can significantly improve the quality of NMR echo data with high noise levels and effectively improve the accuracy and reliability of inversion results.
The data processing mode is vital to the performance of an entire coalmine gas early-warning system, especially its real-time performance. Our objective was to present the structural features of coalmine gas data so that the data could be processed at different priority levels in C language. Two data processing models, one with priority and one without, were built based on queuing theory. Their theoretical formulas were derived via an M/M/1 model in order to calculate the average occupation time of each measuring point in an early-warning program. We validated the model with the gas early-warning system of the Huaibei Coalmine Group Corp. The results indicate that the average occupation time for gas data processing using the queuing model with priority is nearly 1/30 of that of the model without priority.
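The two queuing models can be contrasted with standard M/M/1 mean-value formulas. The sketch below assumes a shared exponential service rate and a non-preemptive priority discipline, since the abstract does not state which discipline the paper uses; the function names are illustrative.

```python
def mm1_fcfs_sojourn(lam, mu):
    """Mean time in system for a plain M/M/1 queue (no priority):
    W = 1 / (mu - lam), valid while lam < mu."""
    assert lam < mu
    return 1.0 / (mu - lam)

def mm1_priority_sojourn(lams, mu):
    """Mean sojourn time per class for a non-preemptive M/M/1 priority
    queue (class 0 = highest priority), all classes served at rate mu.

    Standard mean-value result:
      R    = sum(lam_i) / mu**2                       # mean residual work
      Wq_k = R / ((1 - s_{k-1}) * (1 - s_k)),  s_k = rho_0 + ... + rho_k
    """
    total = sum(lams)
    assert total < mu
    R = total / mu ** 2
    out, s_prev = [], 0.0
    for lam in lams:
        s_k = s_prev + lam / mu
        wq = R / ((1 - s_prev) * (1 - s_k))
        out.append(wq + 1.0 / mu)   # waiting time plus own service
        s_prev = s_k
    return out
```

With two equally loaded classes the high-priority class waits far less than under FCFS while the overall mean waiting time is conserved, which is the effect the paper exploits for urgent gas readings.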
One of the most important project missions of neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) with high power into the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for handling experimental data, covering data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is programmed in C language and runs on the Linux operating system, based on the TCP network protocol and multi-threading technology. The hardware mainly includes an industrial control computer (IPC), a data server, and PXI DAQ cards. This software has been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well.
With the development of laser-induced breakdown spectroscopy (LIBS), increasing numbers of researchers have begun to focus on problems of application. We are no longer satisfied with merely identifying which elements are present in a sample; we are also eager to accomplish quantitative detection with LIBS. There are several ways to improve the limit of detection and the stability, which are important for quantitative detection, especially of trace elements: increasing the laser energy and the resolution of the spectrometer, using a dual-pulse setup, evacuating the ablation environment, and so on. All of these methods upgrade the hardware system, which is effective but expensive. In this paper we therefore establish the following spectrum data processing methods to improve trace element analysis: spectrum sifting, noise filtering, and peak fitting. Each of these three method groups contains several small algorithms, which we introduce in detail. Finally, we discuss how these methods affect the results of trace element detection in an experiment analyzing the lead content in Chinese cabbage.
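The three method groups might look, in heavily simplified form, like the sketch below. The sifting criterion, the moving-average filter, and the parabolic peak model are illustrative assumptions, not the paper's actual algorithms.

```python
def sift_spectra(spectra, tol=0.3):
    """Spectrum sifting: keep shots whose total intensity lies within
    a fraction tol of the median total, rejecting unstable ablations."""
    totals = sorted(sum(s) for s in spectra)
    med = totals[len(totals) // 2]
    return [s for s in spectra if abs(sum(s) - med) <= tol * med]

def smooth(y, w=3):
    """Noise filtering: centered moving average of window w."""
    h = w // 2
    return [sum(y[max(0, i - h):i + h + 1]) / len(y[max(0, i - h):i + h + 1])
            for i in range(len(y))]

def peak_fit(y):
    """Peak fitting: parabolic interpolation around the maximum sample,
    returning (sub-sample position, estimated peak height)."""
    i = max(range(1, len(y) - 1), key=lambda k: y[k])
    a, b, c = y[i - 1], y[i], y[i + 1]
    denom = a - 2 * b + c
    d = (a - c) / (2.0 * denom) if denom != 0 else 0.0
    h = b - 0.25 * (a - c) * d
    return i + d, h
```

Quantitative work would replace the parabola with a Lorentzian or Voigt line shape, but the pipeline order (sift, then filter, then fit) is the point being illustrated.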
An idea is presented about the development of a data processing and analysis system for ICF experiments, which is based on an object-oriented framework. The design and preliminary implementation of the data processing and analysis framework based on the ROOT system have been completed. Software for unfolding soft X-ray spectra has been developed to test the functions of this framework.
To improve the detection rate and lower the false-positive rate of intrusion detection systems, dimensionality reduction is widely used. For this purpose, a data processing (DP) scheme combined with a support vector machine (SVM) was built. Unlike the traditional approaches of identifying redundant data before purging the audit data by expert knowledge, or of building a classifier from various subsets of the available 41 connection attributes, the proposed strategy first removes the attributes whose correlation with another attribute exceeds a threshold, and then treats two sequential samples as one class, removing either of the two, when their similarity exceeds a threshold. Performance experiments showed that the DP-with-SVM strategy is superior to other existing data reduction strategies (e.g., audit reduction, rule extraction, and feature selection), and that the detection model based on DP and SVM outperforms those based on data mining, soft computing, and hierarchical principal component analysis neural networks.
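The first step of the strategy, removing attributes whose correlation with another attribute exceeds a threshold, can be sketched as follows. The greedy keep-first-seen ordering is an assumption; the paper does not say which of a correlated pair it discards.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    if sxx == 0 or syy == 0:
        return 0.0
    return sxy / (sxx * syy) ** 0.5

def drop_correlated(rows, threshold=0.95):
    """Scan attributes left to right; keep one only if its absolute
    correlation with every already-kept attribute stays at or below
    the threshold. Returns the indices of the kept columns."""
    ncols = len(rows[0])
    cols = [[r[j] for r in rows] for j in range(ncols)]
    kept = []
    for j in range(ncols):
        if all(abs(pearson(cols[j], cols[k])) <= threshold for k in kept):
            kept.append(j)
    return kept
```

On the 41-attribute connection records mentioned in the abstract, this pass would shrink the feature set before the SVM ever sees the data.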
A method of fast data processing has been developed to rapidly obtain the evolution of the electron density profile for the multichannel polarimeter-interferometer system (POLARIS) on J-TEXT. Compared with the Abel inversion method, the evolution of the density profile analyzed by this method can quickly offer important information. The method has the advantage of fast calculation, on the order of ten milliseconds per normal shot, and is capable of processing data sampled at up to 1 MHz, which is helpful for studying density sawtooth instability and disruptions between shots. During the flat-top plasma current of usual ohmic discharges on J-TEXT, the shape factor u ranges from 4 to 5. When a disruption happens, the density profile becomes peaked and the shape factor u typically decreases to 1.
The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we propose some methods for the processing of measuring data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. Grey generation and the GM(1,1) model are used to create new data for these blanks. For the uneven data sequence created by measuring error, the mean generation is used to smooth it, and then the stepwise and smooth generations are used to improve the data sequence.
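A minimal GM(1,1) implementation, of the kind that could regenerate values for the blanks left by outlier removal, might look like this. The fit follows the standard grey-model derivation (accumulate, fit the whitening equation by least squares, difference the time response); the function name and interface are hypothetical.

```python
import math

def gm11(x0, n_pred=1):
    """Fit a GM(1,1) grey model to the positive sequence x0 and
    forecast n_pred further values."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)
    # background values z(k): mean of consecutive accumulations
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # least squares for a, b in the grey equation x0(k) + a*z(k) = b
    m = n - 1
    szz = sum(v * v for v in z)
    sz = sum(z)
    sy = sum(y)
    szy = sum(zv * yv for zv, yv in zip(z, y))
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # time response x1_hat(t) = c*exp(-a*(t-1)) + b/a, differenced back
    c = x0[0] - b / a
    pred = []
    for k in range(n, n + n_pred):
        x1_next = c * math.exp(-a * k) + b / a
        x1_curr = c * math.exp(-a * (k - 1)) + b / a
        pred.append(x1_next - x1_curr)
    return pred
```

GM(1,1) tracks near-exponential trends well from very few points, which is why grey generation suits short gap-filling rather than long extrapolation.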
As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source is being developed and gradually applied to provide a source plasma, with the advantages of ease of control and high reliability, and it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large amount of original experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system realizing the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of such a system for the RF ion source is presented. The hardware platform is based on the PXI system, and the software is programmed in the LabVIEW development environment. The key technologies used in the implementation mainly include long-pulse data acquisition, multi-threading, the transmission control communication protocol, and the Lempel-Ziv-Oberhumer data compression algorithm. This design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With its help, the stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time, which is of practical significance for the RF experiments.
With permanent down-hole gauges (PDGs) widely installed in oilfields around the world in recent years, a continuous stream of transient pressure data is now available in real time, which motivates a new round of research interest in further developing pressure transient processing and analysis techniques. Transient pressure measurements from PDGs are characterized by long-term, high-volume data. These data are recorded under unconstrained circumstances, so the effects of noise, rate fluctuation, and interference from other wells cannot be avoided. These effects make the measured pressure trends decline or rise and thereby obscure or distort the actual flow behavior, which makes subsequent analysis difficult. In this paper, the problems encountered in the analysis of PDG transient pressure are investigated, and a newly developed workflow for processing and analyzing PDG transient pressure data is proposed. Numerical well-testing synthetic studies are performed to demonstrate these procedures. The results prove that the new technique works well, and its potential for practical application looks very promising.
Purpose: The interdisciplinary nature and rapid development of the Semantic Web led to the mass publication of RDF data in a large number of widely accepted serialization formats, creating the need for RDF data processing for specific purposes. The paper reports on an assessment of the chief RDF data endpoint challenges and introduces the RDFAdaptor, a set of plugins for RDF data processing that covers the whole life-cycle with high efficiency. Design/methodology/approach: The RDFAdaptor is designed on top of the prominent ETL tool Pentaho Data Integration, which provides a user-friendly and intuitive interface and allows connection to various data sources and formats, and reuses the Java framework RDF4J as middleware to access data repositories, SPARQL endpoints, and all leading RDF database solutions with SPARQL 1.1 support. It supports effortless services with various configuration templates in multi-scenario applications, and helps extend data processing tasks in other services or tools to complement missing functions. Findings: The proposed comprehensive RDF ETL solution, RDFAdaptor, provides an easy-to-use and intuitive interface, supports data integration and federation over multi-source heterogeneous repositories or endpoints, and manages linked data in hybrid storage mode. Research limitations: The plugin set can support several application scenarios of RDF data processing, but error detection/checking and interaction with other graph repositories remain to be improved. Practical implications: The plugin set provides a user interface and configuration templates which enable its use in various applications of RDF data generation, multi-format data conversion, remote RDF data migration, and RDF graph update in the semantic query process. Originality/value: This is the first attempt to develop components, instead of systems, that can extract, consolidate, and store RDF data on the basis of an ecologically mature data warehousing environment.
In order to obtain diagnostic data with physical meaning, the acquired raw data must be processed through a series of physical formulas or processing algorithms. Some diagnostic data are acquired and processed by the diagnostic systems themselves. These data processing programs are specific and usually run manually, and the processed results of the analytical data are stored on local disks, which is unshared and unsafe. Thus, it is necessary to integrate all the specific processing programs and build an automatic and unified data analysis system with shareable data storage. This paper introduces the design and implementation of such an online analysis system. Based on the MDSplus event mechanism, the system deploys synchronous operations for the different processing programs. According to the computational complexity and real-time requirements, combined with the programmability of parallel algorithms and hardware costs, OpenMP parallel processing technology is applied to the EAST analysis system and significantly enhances the processing efficiency.
The efficiency of a Mewis propeller duct is examined through the analysis of ship operational data. The analysis employs data collected at high frequency over a three-year period for two sister vessels, one of them fitted with a Mewis-type duct. Our approach to identifying improvements in the operational performance of the ship equipped with the duct is two-fold. Firstly, we calculate appropriate key performance indicators (KPIs) to monitor the vessels' performance in time for different operational periods and loading conditions. An extensive pre-processing stage is necessary to prepare a dataset free from data points that could impair the analysis, such as outliers, and to make the appropriate preparations for a meaningful KPI calculation. Secondly, we develop a multiple linear regression model for the prediction of main engine fuel oil consumption based on operational and weather parameters, such as the ship's speed, mean draft, trim, rudder angle, and wind speed. The aim is to quantify the reductions due to the Mewis duct for several scenarios. Key results of the studies reveal a contribution of the Mewis duct mainly in laden condition, in the lower speed range, and in the long-term period after dry-docking.
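The regression step can be illustrated with a plain ordinary-least-squares fit via the normal equations. The predictors named in the abstract (speed, mean draft, trim, rudder angle, wind speed) would form the columns of X; the solver below and the toy data in the test are assumptions for illustration only.

```python
def fit_linear(X, y):
    """Ordinary least squares with an intercept, solved through the
    normal equations. Returns coefficients [b0, b1, ..., bp]."""
    rows = [[1.0] + list(r) for r in X]   # prepend intercept column
    p = len(rows[0])
    # normal equations: A @ beta = c
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Gaussian elimination with partial pivoting
    for i in range(p):
        piv = max(range(i, p), key=lambda k: abs(A[k][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for k in range(i + 1, p):
            f = A[k][i] / A[i][i]
            for j in range(i, p):
                A[k][j] -= f * A[i][j]
            c[k] -= f * c[i]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):
        beta[i] = (c[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, p))) / A[i][i]
    return beta

def predict(beta, x):
    """Evaluate the fitted model at a new observation x."""
    return beta[0] + sum(b * v for b, v in zip(beta[1:], x))
```

Comparing predicted consumption against measurements for the ducted and non-ducted sister vessel is then a matter of evaluating `predict` over matched operating scenarios.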
With the growing popularity of data-intensive services on the Internet, the traditional process-centric model for business processes meets challenges due to its inability to describe data semantics and dependencies, resulting in inflexibility in the design and implementation of processes. This paper proposes a novel data-aware business process model able to describe both explicit control flow and implicit data flow. A data model with dependencies formulated in linear-time temporal logic (LTL) is presented, and their satisfiability is validated by an automaton-based model checking algorithm. Data dependencies are fully considered in the modeling phase, which helps to improve the efficiency and reliability of programming during the development phase. Finally, a prototype system for data-aware workflow based on jBPM is designed using this model, and has been deployed to the Beijing Kingfore heating management system to validate the flexibility, efficacy, and convenience of our approach for massive coding and large-scale system management in practice.
High resolution of post-stack seismic data assists in better interpretation of subsurface structures as well as higher accuracy of impedance inversion. Therefore, geophysicists consistently strive to acquire higher-resolution seismic images in petroleum exploration. Although there have been successful applications of conventional signal processing and machine learning to post-stack seismic resolution enhancement, there is limited reference to seismic applications of the recently emerged and rapidly developing generative artificial intelligence. Hence, we propose to apply diffusion models, among the most popular generative models, to enhance seismic resolution. Specifically, we apply the classic diffusion model, the denoising diffusion probabilistic model (DDPM), conditioned on the low-resolution seismic data, to reconstruct corresponding high-resolution images. The entire scheme is referred to as SeisResoDiff. To provide a comprehensive and clear understanding of SeisResoDiff, we introduce the basic theory of diffusion models and detail the derivation of the optimization objective with the aid of diagrams and algorithms. For implementation, we first propose a practical workflow to acquire abundant training data based on generated pseudo-wells. Subsequently, we apply the trained model to both synthetic and field datasets, evaluating the results in three aspects: the appearance of seismic sections and slices in the time domain, frequency spectra, and comparisons with the synthetic data using real well-logging data at the well locations. The results demonstrate not only effective seismic resolution enhancement but also additional denoising by the diffusion model. Experimental comparisons indicate that training the model on noisy data, which are more realistic, outperforms training on clean data. The proposed scheme demonstrates superiority over some conventional methods in high-resolution reconstruction and denoising ability, yielding more competitive results compared to our previous research.
The High-energy Fragment Separator (HFRS), currently under construction, is a leading international radioactive beam device. Multiple sets of position-sensitive twin time projection chamber (TPC) detectors are distributed on HFRS for particle identification and beam monitoring. The twin TPCs' readout electronics system operates in a trigger-less mode due to its high counting rate, which leads to the challenge of handling large amounts of data. To address this problem, we introduce an event-building algorithm. The algorithm employs a hierarchical processing strategy to compress data during transmission and aggregation. In addition, it reconstructs twin-TPC events online and stores only the reconstructed particle information, which significantly reduces the burden on data transmission and storage resources. Simulation studies demonstrated that the algorithm accurately matches twin-TPC events and reduces the data volume by more than 98% at a counting rate of 500 kHz/channel.
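The coincidence-matching core of an event builder can be sketched as a two-pointer sweep over time-sorted hit streams from the two chambers. This is a simplified stand-in for the HFRS hierarchical algorithm: the coincidence window, the `(timestamp, payload)` layout, and the policy of dropping unmatched hits are all assumptions.

```python
def build_events(hits_a, hits_b, window):
    """Match time-sorted hit lists from the twin TPCs.

    Each hit is a (timestamp, payload) tuple; two hits form one event
    when their timestamps differ by at most `window`. Unmatched hits
    are dropped, which is what shrinks the output data volume in a
    trigger-less stream.
    """
    events = []
    i = j = 0
    while i < len(hits_a) and j < len(hits_b):
        ta, tb = hits_a[i][0], hits_b[j][0]
        if abs(ta - tb) <= window:
            # store only the reconstructed event, not the raw hits
            events.append((0.5 * (ta + tb), hits_a[i][1], hits_b[j][1]))
            i += 1
            j += 1
        elif ta < tb:
            i += 1      # hit in A has no partner yet; advance A
        else:
            j += 1      # hit in B has no partner yet; advance B
    return events
```

Because both streams are consumed in a single pass, the matcher runs in linear time, which is what makes online reconstruction at hundreds of kHz per channel plausible.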
To realize carbon neutrality, there is an urgent need to develop sustainable, green energy systems, especially solar energy systems, owing to the environmental friendliness of solar energy and the substantial greenhouse gas emissions from fossil-fuel-based power sources. In the evolution of intelligent green energy systems, Internet of Things (IoT)-based green-smart photovoltaic (PV) systems have been brought into the spotlight owing to their cutting-edge sensing and data-processing technologies. This review focuses on three critical segments of IoT-based green-smart PV systems. First, the climatic parameters and sensing technologies for IoT-based PV systems under extreme weather conditions are presented. Second, the methods for processing data from smart sensors are discussed, in order to realize health monitoring of PV systems under extreme environmental conditions. Third, the smart materials applied in sensors and the insulation materials used in PV backsheets are susceptible to aging, and these materials and their aging phenomena are highlighted. This review also offers new perspectives for optimizing the current international standards for green energy systems using big data from IoT-based smart sensors.
To analyze the errors in processing data, the testing principle for jet elements is introduced and the properties of the testing system are studied theoretically and experimentally. On this basis, the method of processing data is presented and the error formulae, which are functions of the testing system properties, are derived. Finally, methods of reducing the errors are provided. The measured results agree with the theoretical conclusions.
Using GIS, GPS, and GPRS, an intelligent monitoring and dispatch system for trucks and shovels in an open pit has been designed and developed. The system can monitor and dispatch open-pit trucks and shovels and play back their historical paths. An intelligent data algorithm is proposed for a practical application; the algorithm counts the number of deliveries by trucks and loadings by shovels. Experiments on real scenes show that the performance of the system is stable and can satisfy production standards in open pits.
文摘A novel method for noise removal from the rotating accelerometer gravity gradiometer(MAGG)is presented.It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle.A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain,smoothness index,and cross-correlation coefficient is designed using the Chebyshev optimal consistent approximation theory.Additionally,a wavelet denoising evaluation function is constructed,with the dmey wavelet basis function identified as most effective for processing gravity gradient data.The results of hard-in-the-loop simulation and prototype experiments show that the proposed processing method has shown a 14%improvement in the measurement variance of gravity gradient signals,and the measurement accuracy has reached within 4E,compared to other commonly used methods,which verifies that the proposed method effectively removes noise from the gradient signals,improved gravity gradiometry accuracy,and has certain technical insights for high-precision airborne gravity gradiometry.
基金supported by Science Foundation of China University of Petroleum,Beijing(Grant Number ZX20210024)Chinese Postdoctoral Science Foundation(Grant Number 2021M700172)+1 种基金The Strategic Cooperation Technology Projects of CNPC and CUP(Grant Number ZLZX2020-03)National Natural Science Foundation of China(Grant Number 42004105)
文摘Low-field(nuclear magnetic resonance)NMR has been widely used in petroleum industry,such as well logging and laboratory rock core analysis.However,the signal-to-noise ratio is low due to the low magnetic field strength of NMR tools and the complex petrophysical properties of detected samples.Suppressing the noise and highlighting the available NMR signals is very important for subsequent data processing.Most denoising methods are normally based on fixed mathematical transformation or handdesign feature selectors to suppress noise characteristics,which may not perform well because of their non-adaptive performance to different noisy signals.In this paper,we proposed a“data processing framework”to improve the quality of low field NMR echo data based on dictionary learning.Dictionary learning is a machine learning method based on redundancy and sparse representation theory.Available information in noisy NMR echo data can be adaptively extracted and reconstructed by dictionary learning.The advantages and application effectiveness of the proposed method were verified with a number of numerical simulations,NMR core data analyses,and NMR logging data processing.The results show that dictionary learning can significantly improve the quality of NMR echo data with high noise level and effectively improve the accuracy and reliability of inversion results.
基金Project 70533050 supported by the National Natural Science Foundation of China
文摘The data processing mode is vital to the performance of an entire coalmine gas early-warning system, especially in real-time performance. Our objective was to present the structural features of coalmine gas data, so that the data could be processed at different priority levels in C language. Two different data processing models, one with priority and the other without priority, were built based on queuing theory. Their theoretical formulas were determined via a M/M/I model in order to calculate average occupation time of each measuring point in an early-warning program. We validated the model with the gas early-warning system of the Huaibei Coalmine Group Corp. The results indicate that the average occupation time for gas data processing by using the queuing system model with priority is nearly 1/30 of that of the model without priority.
基金supported by National Natural Science Foundation of China(No.11075183)
文摘One of the most important project missions of neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) with high power energy t.o the plasma of the EAST superconducting tokamak. Correspondingly, it's necessary to construct a high-speed and reliable computer data processing system for processing experimental data, such as data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The set of software is programmed in C language and runs on Linux operating system based on TCP network protocol and multi-threading technology. The hardware mainly includes industrial control computer (IPC), data server, PXI DAQ cards and so on. Now this software has been applied to EAST NBI system, and experimental results show that the CDPS can serve EAST NBI very well.
基金supported by National High-Tech R&D Program(863 Program),China(No.2013AA102402)
文摘With the development of Laser Induced Breakdown Spectroscopy (LIBS), increasing numbers of researchers have begun to focus on problems of the application. We are not just satisfied with analyzing what kinds of elements are in the samples but are also eager to accomplish quantitative detection with LIBS. There are several means to improve the limit of detection and stability, which are important to quantitative detection, especially of trace elements, increasing the laser energy and the resolution of spectrometer, using dual pulse setup, vacuuming the ablation environment etc. All of these methods are about to update the hardware system, which is effective but expensive. So we establish the following spectrum data processing methods to improve the trace elements analysis in this paper: spectrum sifting, noise filtering, and peak fitting. There are small algorithms in these three method groups, which we will introduce in detail. Finally, we discuss how these methods affect the results of trace elements detection in an experiment to analyze the lead content in Chinese cabbage.
基金This project supported by the National High-Tech Research and Development Plan (863-804-3)
Abstract: An idea is presented for the development of a data processing and analysis system for ICF experiments, based on an object-oriented framework. The design and preliminary implementation of the data processing and analysis framework based on the ROOT system have been completed. Software for unfolding soft X-ray spectra has been developed to test the functions of this framework.
Funding: The National Natural Science Foundation of China (No. 60672049).
Abstract: To improve the detection rate and lower the false-positive rate of intrusion detection systems, dimensionality reduction is widely used. For this purpose, a data processing (DP) strategy combined with a support vector machine (SVM) was built. Unlike traditional approaches, which identify redundant data before purging the audit data by expert knowledge, or which build a classifier from various subsets of the available 41 connection attributes, the proposed strategy first removes any attribute whose correlation with another attribute exceeds a threshold, and then treats two sequential samples as one class, removing one of the two whenever their similarity exceeds a threshold. Performance experiments showed that the DP-and-SVM strategy is superior to existing data reduction strategies (e.g., audit reduction, rule extraction, and feature selection), and that the detection model based on DP and SVM outperforms those based on data mining, soft computing, and hierarchical principal component analysis neural networks.
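The correlation-based attribute removal step described above can be sketched as follows; the threshold value and the toy data are illustrative assumptions, since the abstract does not state the threshold used:

```python
import numpy as np

def drop_correlated(X, threshold=0.9):
    """Drop every attribute whose absolute correlation with an
    earlier-kept attribute exceeds the threshold."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] <= threshold for k in keep):
            keep.append(j)
    return X[:, keep], keep

# Toy data: column 1 is (almost) a scaled copy of column 0.
rng = np.random.default_rng(1)
a = rng.normal(size=200)
X = np.column_stack([a, 2 * a + rng.normal(0.0, 1e-3, 200),
                     rng.normal(size=200)])
X_reduced, kept = drop_correlated(X, threshold=0.9)  # column 1 is dropped
```

A greedy first-kept-wins rule like this is order-dependent; which of two highly correlated attributes survives depends on column order.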
Funding: Supported by the National Magnetic Confinement Fusion Science Program of China (Nos. 2014GB106000, 2014GB106002, and 2014GB106003), the National Natural Science Foundation of China (Nos. 11275234, 11375237, and 11505238), and a Scientific Research Grant of the Hefei Science Center of CAS (No. 2015SRG-HSC010).
Abstract: A method of fast data processing has been developed to rapidly obtain the evolution of the electron density profile for the multichannel polarimeter-interferometer system (POLARIS) on J-TEXT. Compared with the Abel inversion method, the evolution of the density profile analyzed by this method can quickly offer important information. The method has the advantage of fast calculation, on the order of ten milliseconds per normal shot, and it is capable of processing data sampled at up to 1 MHz, which is helpful for studying density sawtooth instability and disruptions between shots. During the flat-top plasma current of usual ohmic discharges on J-TEXT, the shape factor u ranges from 4 to 5. When a disruption occurs, the density profile becomes peaked and the shape factor u typically decreases to 1.
Abstract: The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we first propose some methods for processing measured data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. Grey generation and the GM(1,1) model are used to create new data for these blanks. For the uneven data sequence created by measuring error, mean generation is used to smooth it, and then stepwise and smooth generations are used to improve the data sequence.
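The GM(1,1) model mentioned above has a standard textbook formulation: accumulate the sequence, fit the grey differential equation by least squares, and difference the fitted accumulated series back. A minimal sketch with an illustrative example sequence (not data from the paper):

```python
import numpy as np

def gm11_fit_predict(x0, n_ahead=1):
    """Fit a GM(1,1) grey model to sequence x0 and predict n_ahead
    further values (standard textbook formulation)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])          # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    k = np.arange(1, n + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate(
        [[x0[0]], np.diff(np.concatenate([[x0[0]], x1_hat]))])
    return x0_hat  # fitted values followed by the forecasts

# A near-exponential sequence is modelled almost exactly.
seq = [10.0, 12.0, 14.4, 17.28, 20.736]
fitted = gm11_fit_predict(seq, n_ahead=1)  # last entry is the forecast
```

For filling blanks left by removed abnormal points, the forecast values (or the fitted values at the blank positions) serve as the replacement data.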
Funding: The NBI team, with partial support from the National Natural Science Foundation of China (No. 61363019) and the National Natural Science Foundation of Qinghai Province (No. 2014-ZJ-718).
Abstract: As the key ion-source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source has been developed and applied gradually, offering a source plasma with the advantages of ease of control and high reliability, and easily achieving long-pulse steady-state operation. During development and testing of the RF ion source, a large amount of original experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer application system for data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system, and the software is programmed in the LabVIEW development environment. The key technologies used in the software implementation include long-pulse data acquisition, multi-threading, the transmission control protocol (TCP) for communication, and the Lempel-Ziv-Oberhumer data compression algorithm. The design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With the help of this design, stable plasma discharge data from the RF ion source are collected, stored, accessed, and monitored in real time, which is of practical significance for RF experiments.
Funding: The Science Foundation of China University of Petroleum, Beijing (No. YJRC-2011-02), for financial support during this research.
Abstract: With permanent down-hole gauges (PDGs) widely installed in oilfields around the world in recent years, a continuous stream of real-time transient pressure data is now available, which motivates a new round of research interest in further developing pressure transient processing and analysis techniques. Transient pressure measurements from PDGs are characterized by long duration and high data volume. These data are recorded under unconstrained circumstances, so effects due to noise, rate fluctuation, and interference from other wells cannot be avoided. These effects make the measured pressure trends decline or rise, obscuring or distorting the actual flow behavior and making subsequent analysis difficult. In this paper, the problems encountered in the analysis of PDG transient pressure are investigated, and a newly developed workflow for processing and analyzing PDG transient pressure data is proposed. Synthetic numerical well-testing studies are performed to demonstrate these procedures. The results prove that the new technique works well, and its potential for practical application looks very promising.
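A common first step when cleaning noisy PDG pressure streams of the kind described is robust outlier removal against a rolling median. The sketch below is an illustrative pre-processing step under assumed parameters, not the paper's actual workflow:

```python
import numpy as np

def remove_outliers(t, p, window=11, k=3.0):
    """Drop pressure samples deviating from a rolling median by more
    than k times the (scaled) rolling median absolute deviation."""
    n = len(p)
    half = window // 2
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        med = np.median(p[lo:hi])
        mad = np.median(np.abs(p[lo:hi] - med)) + 1e-12
        if abs(p[i] - med) > k * 1.4826 * mad:  # 1.4826: MAD-to-sigma
            keep[i] = False
    return t[keep], p[keep]

# Synthetic declining pressure trend (psi) with two spike outliers.
t = np.linspace(0.0, 10.0, 200)
p = 3000.0 - 20.0 * t + np.random.default_rng(2).normal(0.0, 0.5, 200)
p[50] += 80.0
p[120] -= 80.0
t_clean, p_clean = remove_outliers(t, p)
```

The median/MAD pair is used instead of mean/standard deviation precisely because both are insensitive to the very spikes being removed.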
Funding: This work is supported by the “National Social Science Foundation in China” project (19BTQ061) and the “Integration and Development on A Next Generation of Open Knowledge Services System and Key Technologies” project (2020XM05).
Abstract: Purpose: The interdisciplinary nature and rapid development of the Semantic Web have led to the mass publication of RDF data in a large number of widely accepted serialization formats, creating the need for purpose-specific RDF data processing. The paper reports an assessment of the chief RDF data endpoint challenges and introduces RDFAdaptor, a set of plugins for RDF data processing that covers the whole life-cycle with high efficiency. Design/methodology/approach: RDFAdaptor is designed on top of the prominent ETL tool Pentaho Data Integration, which provides a user-friendly, intuitive interface and allows connection to various data sources and formats, and it reuses the Java framework RDF4J as middleware to access data repositories, SPARQL endpoints, and all leading RDF database solutions with SPARQL 1.1 support. It supports effortless services with various configuration templates in multi-scenario applications and helps extend data-processing tasks in other services or tools to complement missing functions. Findings: The proposed comprehensive RDF ETL solution, RDFAdaptor, provides an easy-to-use and intuitive interface, supports data integration and federation over multi-source heterogeneous repositories or endpoints, and manages linked data in hybrid storage mode. Research limitations: The plugin set can support several application scenarios of RDF data processing, but error detection/checking and interaction with other graph repositories remain to be improved. Practical implications: The plugin set provides a user interface and configuration templates that enable its use in various applications of RDF data generation, multi-format data conversion, remote RDF data migration, and RDF graph update during semantic query processing. Originality/value: This is the first attempt to develop components, instead of systems, that can extract, consolidate, and store RDF data on the basis of an ecologically mature data warehousing environment.
Funding: Supported by the Chinese National Fusion Project for ITER (No. 2012GB105000), the Anhui Provincial Natural Science University Research Project (No. KJ2012A144), the Grants for Scientific Research of BSKY (No. XJ201125) from Anhui Medical University, China, the Anhui Provincial Science Foundation for Outstanding Young Talent (No. 2012SQRL265), and the Young and Middle-Aged Academic Backbone Finance Fund from Anhui Medical University, China.
Abstract: In order to obtain diagnostic data with physical meaning, the acquired raw data must be processed through a series of physical formulas or processing algorithms. Some diagnostic data are acquired and processed by the diagnostic systems themselves. These data processing programs are specific and usually run manually, and the processed results of the analytical data are stored on local disks, which are unshared and unsafe. Thus, it is necessary to integrate all the specific processing programs and build an automatic, unified data analysis system with shareable data storage. This paper introduces the design and implementation of the online analysis system. Based on the MDSplus event mechanism, the system deploys synchronous operations for the different processing programs. According to the computational complexity and real-time requirements, combined with the programmability of parallel algorithms and hardware costs, OpenMP parallel processing technology is applied to the EAST analysis system and significantly enhances the processing efficiency.
Abstract: The efficiency of a Mewis propeller duct is examined through the analysis of ship operational data. The analysis employs data collected at high frequency over a three-year period for two sister vessels, one of them fitted with a Mewis-type duct. Our approach to identifying improvements in the operational performance of the ship equipped with the duct is two-fold. First, we calculate appropriate key performance indicators (KPIs) to monitor vessel performance over time for different operational periods and loading conditions. An extensive pre-processing stage is necessary to prepare a dataset free from data points that could impair the analysis, such as outliers, and to make the appropriate preparations for a meaningful KPI calculation. The second approach concerns the formulation of a multiple linear regression problem for the prediction of main-engine fuel oil consumption from operational and weather parameters, such as the ship's speed, mean draft, trim, rudder angle, and wind speed. The aim is to quantify reductions due to the Mewis duct for several scenarios. Key results reveal a contribution of the Mewis duct mainly in the laden condition, in the lower speed range, and in the long-term period after dry-docking.
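A regression of the kind described above can be sketched with ordinary least squares on the listed predictors. Everything below, including the "true" consumption model, coefficients, and ranges, is synthetic and purely illustrative:

```python
import numpy as np

# Illustrative feature set from the abstract: speed, mean draft, trim,
# rudder angle, wind speed. All data here are synthetic.
rng = np.random.default_rng(3)
n = 500
speed = rng.uniform(8.0, 14.0, n)     # knots
draft = rng.uniform(7.0, 13.0, n)     # metres
trim = rng.uniform(-2.0, 2.0, n)      # metres
rudder = rng.uniform(-5.0, 5.0, n)    # degrees
wind = rng.uniform(0.0, 15.0, n)      # m/s

# Assumed "true" daily fuel oil consumption (tonnes/day) plus noise.
foc = 0.02 * speed ** 3 + 0.4 * draft + 0.3 * np.abs(trim) + 0.05 * wind ** 2
foc += rng.normal(0.0, 0.5, n)

# Design matrix with an intercept column; solve by ordinary least squares.
X = np.column_stack([np.ones(n), speed, draft, trim, rudder, wind])
coef, *_ = np.linalg.lstsq(X, foc, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((foc - pred) ** 2) / np.sum((foc - np.mean(foc)) ** 2)
```

In a duct study, such a model fitted on the reference vessel (or reference period) gives an expected consumption against which the duct-fitted vessel's measured consumption can be compared scenario by scenario.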
Funding: Supported by the National Natural Science Foundation of China (Nos. 61502043 and 61132001), the Beijing Natural Science Foundation (No. 4162042), and the Beijing Talents Fund (No. 2015000020124G082).
Abstract: With the growing popularity of data-intensive services on the Internet, the traditional process-centric model for business processes faces challenges due to its inability to describe data semantics and dependencies, resulting in inflexible process design and implementation. This paper proposes a novel data-aware business process model that is able to describe both explicit control flow and implicit data flow. A data model with dependencies formulated in Linear-time Temporal Logic (LTL) is presented, and their satisfiability is validated by an automaton-based model-checking algorithm. Data dependencies are fully considered in the modeling phase, which helps improve the efficiency and reliability of programming during the development phase. Finally, a prototype system for data-aware workflow based on jBPM is designed using this model and has been deployed to the Beijing Kingfore heating management system to validate the flexibility, efficacy, and convenience of our approach for massive coding and large-scale system management in practice.
Funding: Supported by the National Natural Science Foundation of China (NSFC), Grant No. 42274147.
Abstract: High resolution of post-stack seismic data assists in better interpretation of subsurface structures as well as higher accuracy of impedance inversion. Therefore, geophysicists consistently strive to acquire higher-resolution seismic images in petroleum exploration. Although conventional signal processing and machine learning have been applied successfully to post-stack seismic resolution enhancement, there is limited work on seismic applications of the recently emerged and rapidly developing generative artificial intelligence. Hence, we propose to apply diffusion models, among the most popular generative models, to enhance seismic resolution. Specifically, we apply the classic diffusion model, the denoising diffusion probabilistic model (DDPM), conditioned on low-resolution seismic data, to reconstruct the corresponding high-resolution images. The entire scheme is referred to as SeisResoDiff. To provide a comprehensive and clear understanding of SeisResoDiff, we introduce the basic theory of diffusion models and detail the derivation of the optimization objective with the aid of diagrams and algorithms. For implementation, we first propose a practical workflow to acquire abundant training data based on generated pseudo-wells. Subsequently, we apply the trained model to both synthetic and field datasets, evaluating the results in three aspects: the appearance of seismic sections and slices in the time domain, frequency spectra, and comparisons with the synthetic data using real well-logging data at the well locations. The results demonstrate not only effective seismic resolution enhancement, but also additional denoising by the diffusion model. Experimental comparisons indicate that training the model on noisy data, which are more realistic, outperforms training on clean data. The proposed scheme demonstrates superiority over some conventional methods in high-resolution reconstruction and denoising ability, yielding more competitive results compared to our previous research.
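The DDPM forward (noising) process that underlies the scheme above has a standard closed form, q(x_t | x_0) = N(sqrt(ᾱ_t)·x_0, (1 − ᾱ_t)·I), which lets any noise level be sampled in one step. A minimal sketch under the common linear beta schedule; the "trace" is a synthetic sine, not seismic data from the paper:

```python
import numpy as np

# Linear beta schedule as in the original DDPM formulation.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)  # cumulative products: alpha_bar_t

def q_sample(x0, t, rng):
    """Sample x_t from q(x_t | x_0) directly, without iterating steps."""
    noise = rng.normal(size=x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * noise
    return x_t, noise

rng = np.random.default_rng(4)
x0 = np.sin(np.linspace(0, 20 * np.pi, 1024))  # stand-in "seismic trace"
x_mid, _ = q_sample(x0, t=100, rng=rng)        # partially noised
x_end, _ = q_sample(x0, t=T - 1, rng=rng)      # almost pure noise
```

Training pairs each x_t with the noise that produced it, so the network learns to predict (and hence remove) the noise; conditioning on the low-resolution section steers the reverse process toward the matching high-resolution image.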
Funding: Partially supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB 34030000) and the National Natural Science Foundation of China (Nos. 11975293 and 12205348).
Abstract: The High-energy Fragment Separator (HFRS), currently under construction, is a leading international radioactive beam facility. Multiple sets of position-sensitive twin time projection chamber (TPC) detectors are distributed along HFRS for particle identification and beam monitoring. Because of its high counting rate, the twin TPCs' readout electronics system operates in trigger-less mode, which poses the challenge of handling large amounts of data. To address this problem, we introduce an event-building algorithm. The algorithm employs a hierarchical processing strategy to compress data during transmission and aggregation. In addition, it reconstructs twin-TPC events online and stores only the reconstructed particle information, which significantly reduces the burden on data transmission and storage resources. Simulation studies demonstrate that the algorithm accurately matches twin-TPC events and reduces the data volume by more than 98% at a counting rate of 500 kHz per channel.
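The core of trigger-less event building is matching hits from the two detectors by timestamp coincidence. A minimal sketch follows; the window width, data layout, and greedy one-pass matching rule are illustrative assumptions, not the paper's algorithm:

```python
# Trigger-less event building by timestamp coincidence: hits from two
# detector planes are matched when their timestamps fall within a
# coincidence window. Inputs must be sorted by timestamp.

def build_events(hits_a, hits_b, window=50):
    """Match time-sorted (timestamp, payload) hit lists from two planes;
    each hit is used at most once (greedy earliest-match strategy)."""
    events, j = [], 0
    for ta, pa in hits_a:
        while j < len(hits_b) and hits_b[j][0] < ta - window:
            j += 1  # hits_b[j] is too old to ever match again
        if j < len(hits_b) and abs(hits_b[j][0] - ta) <= window:
            events.append((ta, pa, hits_b[j][1]))
            j += 1
        # else: unmatched hit in plane A is dropped
    return events

hits_a = [(100, "a1"), (300, "a2"), (900, "a3")]
hits_b = [(120, "b1"), (700, "b2"), (910, "b3")]
events = build_events(hits_a, hits_b)
```

Because both inputs are time-sorted, the scan is linear in the total number of hits, which is what makes online operation at high counting rates plausible; storing only the matched events rather than the raw hit stream is where the large data reduction comes from.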
Funding: The National Key R&D Program of China (Grant No. 2023YFE0114600), the National Natural Science Foundation of China (NSFC) (Grant No. 52477029), the Joint Laboratory of China-Morocco Green Energy and Advanced Materials, the Youth Innovation Team of Shaanxi Universities, the Xi'an City Science and Technology Project (No. 23GXFW0070), and the Xi'an International Science and Technology Cooperation Base.
Abstract: To realize carbon neutrality, there is an urgent need to develop sustainable, green energy systems (especially solar energy systems), owing to the environmental friendliness of solar energy and the substantial greenhouse gas emissions of fossil fuel-based power sources. In the evolution of intelligent green energy systems, Internet of Things (IoT)-based green-smart photovoltaic (PV) systems have come into the spotlight owing to their cutting-edge sensing and data-processing technologies. This review focuses on three critical segments of IoT-based green-smart PV systems. First, the climatic parameters and sensing technologies for IoT-based PV systems under extreme weather conditions are presented. Second, the methods for processing data from smart sensors are discussed, with a view to realizing health monitoring of PV systems under extreme environmental conditions. Third, the smart materials applied in sensors and the insulation materials used in PV backsheets are susceptible to aging, and these materials and their aging phenomena are highlighted. The review also offers new perspectives for optimizing current international standards for green energy systems using big data from IoT-based smart sensors.
Abstract: To analyze the errors in processed data, the testing principle for jet elements is introduced, and the properties of the testing system are studied theoretically and experimentally. On this basis, the data processing method is presented, and the error formulae, which are functions of the testing-system properties, are derived. Finally, methods of reducing the errors are provided. The measured results are consistent with the theoretical conclusions.
Abstract: Using GIS, GPS, and GPRS, an intelligent monitoring and dispatch system for trucks and shovels in an open pit has been designed and developed. The system can monitor and dispatch open-pit trucks and shovels and play back their historical paths. An intelligent data algorithm is proposed for a practical application; the algorithm counts the deliveries of trucks and the loadings of shovels. Experiments on real scenes show that the system performs stably and can satisfy production standards in open pits.