Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA0350300), the National Natural Science Foundation of China (12203105, 12103091, 62394351, 12073008), and the China Manned Space Project (CMS-CSST-2021-A12, CMS-CSST-2021-B10).
Abstract: The Wide Field Survey Telescope (WFST) is located at 4200 m on Saishiteng Mountain in Lenghu, Qinghai Province, China. It features a primary mirror with a diameter of 2.5 m and a camera equipped with nine CCDs, providing a wide field of view of approximately 3×3 square degrees. Calibration parameters are essential to ensure the precision of astrometric observations with the WFST. These parameters are derived from the geometric distortion (GD) and the interchip gaps through astrometric modeling and are subsequently validated via the Yao’An High Precision Telescope (YAHPT). The GD solutions show maximum distortions between 1.18 and 10.29 pixels for the WFST chips, with the central chips exhibiting lower distortion. After applying the GD correction, the astrometric precision of the WFST reaches 4 mas. The interchip gaps of the WFST range from 1.922 mm to 7.765 mm (at a pixel size of 10 μm), consistent with the design and measurements. The calibrated parameters guarantee that the WFST can perform highly accurate astrometric measurements. Furthermore, as the WFST is updated, the parameter model remains applicable.
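Geometric distortion solutions of this kind are commonly expressed as a low-order polynomial mapping between measured pixel coordinates and distortion-corrected coordinates; the sketch below shows only that generic form, since the abstract does not state the order or exact model the authors fit.

```latex
% Generic per-chip polynomial GD model (the order n and the exact model are
% assumptions here, not taken from the paper):
\xi  \;=\; \sum_{i+j \le n} a_{ij}\, x^{i} y^{j},
\qquad
\eta \;=\; \sum_{i+j \le n} b_{ij}\, x^{i} y^{j}
```

Here (x, y) are measured pixel coordinates on a chip, (ξ, η) are the corrected coordinates entering the astrometric solution, and the coefficients a_ij, b_ij would be fitted separately for each of the nine chips.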
Funding: Supported by the Project of Guangdong Science and Technology Department (2020B010166005), the Post-Doctoral Research Project (Z000158), the Ministry of Education Social Science Fund (22YJ630167), the Fund Project of the Department of Science and Technology of Guangdong Province (GDK TP2021032500), and the Guangdong Philosophy and Social Science (GD22YYJ15).
Abstract: Inter-agency government information sharing (IAGIS) plays an important role in improving the service and efficiency of government agencies. Currently, there is still no effective and secure way for data-driven IAGIS to fulfill the dynamic demands of information sharing between government agencies. Motivated by blockchain and data mining, a data-driven framework for IAGIS is proposed in this paper. Firstly, the blockchain is used as the core of the framework for monitoring and preventing leakage and abuse of government information, in order to guarantee information security. Secondly, a four-layer architecture is designed to implement the proposed framework. Thirdly, the classical data mining algorithms PageRank and Apriori are applied to dynamically design smart contracts for information sharing, so that the information sharing strategies can be adjusted flexibly according to the practical demands of government agencies for public management and public service. Finally, a case study is presented to illustrate the operation of the proposed framework.
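As an illustration of the PageRank component, a minimal power-iteration implementation over a directed sharing graph is sketched below; the graph construction, the Apriori mining step, and how scores map onto smart-contract terms are not specified in the abstract and are assumptions here.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-8, max_iter=200):
    """Power-iteration PageRank over a directed sharing graph.

    adj[i][j] = 1 if node i (an agency or data item) shares/links to node j.
    """
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    out_deg = A.sum(axis=1)
    # Row-stochastic transition matrix; dangling nodes spread rank uniformly.
    T = np.where(out_deg[:, None] > 0,
                 A / np.maximum(out_deg, 1.0)[:, None],
                 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1.0 - d) / n + d * (T.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r_new

# toy 4-node sharing graph
print(pagerank([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]]))
```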
Abstract: A novel method for noise removal from the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining the band-pass filter parameters based on signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using Chebyshev optimal uniform approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. The results of hardware-in-the-loop simulations and prototype experiments show that, compared with other commonly used methods, the proposed processing method improves the measurement variance of gravity gradient signals by 14% and achieves a measurement accuracy within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers useful guidance for high-precision airborne gravity gradiometry.
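A minimal sketch of zero-phase band-pass filtering with a head-to-tail style data expansion is given below; the mirror extension, the filter design, and the parameter choices are illustrative assumptions, not the paper's exact scheme (which selects the band edges via SNR gain, smoothness, and cross-correlation criteria).

```python
import numpy as np
from scipy.signal import butter, lfilter

def zero_phase_bandpass(x, fs, f_lo, f_hi, order=4, n_ext=None):
    """Zero-phase band-pass filtering with a head-to-tail data expansion.

    The record is extended by mirroring its head and tail (a stand-in for the
    paper's expansion scheme), filtered forward and backward so the phase
    responses cancel, and then trimmed back to its original length.
    """
    x = np.asarray(x, dtype=float)
    n_ext = n_ext or len(x) // 4
    ext = np.concatenate([x[n_ext:0:-1], x, x[-2:-n_ext - 2:-1]])
    b, a = butter(order, [f_lo, f_hi], btype="bandpass", fs=fs)
    y = lfilter(b, a, ext)              # forward pass
    y = lfilter(b, a, y[::-1])[::-1]    # backward pass cancels the phase delay
    return y[n_ext:n_ext + len(x)]
```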
Funding: Project (cstc2020jcyj-bshX0106) supported by the Chongqing Postdoctoral Science Foundation, China; Project (2020M683247) supported by the China Postdoctoral Science Foundation; Project (cstc2020jcyj-zdxmX0023) supported by the Key Natural Science Foundation Project of Chongqing, China; Project (551974043) supported by the National Natural Science Foundation of China.
Abstract: Source location is the core foundation of microseismic monitoring. To date, commonly used location methods have usually been based on the ray-tracing travel-time technique, which generally adopts an L1 or L2 norm to establish the location objective function. However, the L1 norm usually achieves low location accuracy, whereas the L2 norm is easily affected by large P-wave arrival-time picking errors. In addition, traditional location methods may be affected by the initial iteration point and converge to a local optimum. Furthermore, P-wave arrival-time data that have travelled long distances are usually of poor quality. To address these problems, this paper presents a microseismic source location method using the Log-Cosh function and distant-sensor-removed P-wave arrival data. Its basic principles are as follows. First, the source location objective function is established using the Log-Cosh function, which combines the stability of the L1 norm with the location accuracy of the L2 norm. Then, multiple initial points are generated randomly in the mining area, and the established Log-Cosh location objective function is used to obtain multiple corresponding location results; the average of the 50 location points with the largest data-field potential values is treated as the initial location result. Next, the P-wave travel times from the initial location result to the triggered sensors are calculated, and the P-wave arrival data with travel times exceeding 0.2 s are removed. Finally, the preceding location steps are repeated with the denoised P-wave arrival dataset to obtain a high-precision location result. Two synthetic events and eight blasting events from the Yongshaba mine, China, were used to test the proposed method. Regardless of whether the P-wave arrival data with long travel times were eliminated, the location error of the proposed method was smaller than that of the L1/L2-norm and trigger-time-based location methods (TT1/TT2 methods). Furthermore, after eliminating the P-wave arrival data with long travel distances, the location accuracy of all three methods increased, indicating that the proposed location method has good application prospects.
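A typical form of such a Log-Cosh objective, assuming picked arrivals t_i at N triggered sensors and a modeled travel time T(·) (e.g. a homogeneous P-wave velocity v_p), is:

```latex
% Log-Cosh location objective over N triggered sensors (homogeneous v_p assumed):
\Phi(\mathbf{x}_s, t_0) \;=\; \sum_{i=1}^{N}
  \log\cosh\!\Bigl( t_i - t_0 - T(\mathbf{x}_i, \mathbf{x}_s) \Bigr),
\qquad
T(\mathbf{x}_i, \mathbf{x}_s) \;=\; \frac{\lVert \mathbf{x}_i - \mathbf{x}_s \rVert}{v_p}
```

Since log cosh(r) ≈ r²/2 for small residuals and ≈ |r| − log 2 for large ones, the objective behaves like the L2 norm near the optimum and like the L1 norm for outlying picks, which is the stability/accuracy trade-off described above.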
Abstract: Based on the relationship among geographic events, spatial changes and database operations, a new automatic (semi-automatic) incremental updating approach for spatio-temporal databases (STDB), named event-based incremental updating (E-BIU), is proposed in this paper. First, the relationship among events, spatial changes and database operations is analyzed; then an overall architecture for the E-BIU implementation is designed, consisting of an event queue, three managers and two sets of rules, and each component is presented in detail. The process of E-BIU for a master STDB is then described. An example of incremental updating of buildings is given to illustrate the approach. The result shows that E-BIU is an efficient automatic updating approach for master STDBs.
Funding: Projects (61363021, 61540061, 61663047) supported by the National Natural Science Foundation of China; Project (2017SE206) supported by the Open Foundation of the Key Laboratory in Software Engineering of Yunnan Province, China.
Abstract: Due to the increasing number of cloud applications, the amount of data in the cloud shows signs of growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most current cloud storage systems adopt a hash-like approach to retrieving data that only supports simple keyword-based queries and lacks various forms of information search. Therefore, a scalable and efficient indexing scheme is clearly required. In this paper, we present a skip-list-based cloud index, called SLC-index, a novel and scalable skip-list-based indexing scheme for cloud data processing. The SLC-index offers a two-layered architecture for extending the indexing scope and facilitating better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, the system is flexible because servers can be added and removed dynamically. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
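The two-layered idea can be illustrated with a toy index in which a global layer routes keys to servers and each server keeps a local sorted structure standing in for its skip list; the class and method names below are hypothetical, and node migration and load balancing are omitted.

```python
import bisect

class SLCIndexSketch:
    """Toy two-layer index: a global layer routes each key to the server whose
    key range covers it; each server keeps a local sorted map standing in for
    its skip list."""

    def __init__(self, boundaries):
        self.boundaries = sorted(boundaries)               # split keys between servers
        self.local = [dict() for _ in range(len(self.boundaries) + 1)]

    def _server(self, key):
        return bisect.bisect_right(self.boundaries, key)

    def put(self, key, value):
        self.local[self._server(key)][key] = value

    def get(self, key):                                    # point query
        return self.local[self._server(key)].get(key)

    def range_query(self, lo, hi):                         # may span several servers
        out = []
        for s in range(self._server(lo), self._server(hi) + 1):
            out += [(k, v) for k, v in sorted(self.local[s].items()) if lo <= k <= hi]
        return out
```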
Funding: Project (2001G025) supported by the Foundation of the Science and Technology Section of the Ministry of Railway of China; Project (2005) supported by the Postdoctoral Foundation of Central South University.
Abstract: To evaluate the fatigue damage reliability of critical members of the Nanjing Yangtze River Bridge, the corresponding expressions for calculating structural fatigue damage reliability were derived from the stress-number (S-N) curve and Miner's rule. Fatigue damage reliability analysis of some critical members of the bridge was carried out using the strain-time histories measured by the structural health monitoring system of the bridge. The corresponding stress spectra were obtained by the real-time rain-flow counting method. Fatigue damage was then calculated by the reliability method at different reliability levels and compared with the results from Miner's rule. The results show that the fatigue damage of the critical members of the Nanjing Yangtze River Bridge is very small due to its low live-load stress level.
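For reference, the deterministic ingredients behind such a formulation are the S-N curve and Miner's linear damage accumulation rule; in the reliability analysis these quantities (and the critical damage) are treated as random variables, a step the abstract does not detail.

```latex
% S-N curve and Miner's linear damage accumulation (deterministic form):
N\,S^{m} = C, \qquad
D \;=\; \sum_{i} \frac{n_i}{N_i} \;=\; \frac{1}{C}\sum_{i} n_i\,S_i^{\,m}
```

Here n_i is the number of stress cycles counted at stress range S_i (from the rain-flow spectra), N_i is the number of cycles to failure at that range, and failure is associated with the accumulated damage D reaching a critical value.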
Funding: Projects (70901074, 71001104, 71201168) supported by the National Natural Science Foundation of China.
Abstract: A variable weight approach was proposed to handle the probability deficiency problem in the evidential reasoning (ER) approach. The probability deficiency problem indicates that the inadequate information in the assessment result should be less than that in the input. However, it was proved that under certain circumstances the ER approach cannot solve the probability deficiency problem. The variable weight approach is based on two assumptions: 1) greater weight should be given to the rule with more adequate information; 2) greater weight should be given to the rules with less disparate information. Assessment results of two notional case studies show that 1) the probability deficiency problem is solved using the proposed variable weight approach, and 2) information with less inadequacy and more disparity is provided to the decision makers to help them reach a consensus.
Funding: Project (2017YFC1405600) supported by the National Key R&D Program of China; Project (18JK05032) supported by the Scientific Research Project of the Education Department of Shaanxi Province, China.
Abstract: Due to the limited scenes that a synthetic aperture radar (SAR) satellite can detect, the full-track utilization rate is not high. Because of the computing and storage limitations of a single satellite, it is difficult to process the large amounts of data produced by spaceborne synthetic aperture radars. A new method of networked satellite data processing is proposed to improve the efficiency of data processing. A multi-satellite distributed SAR real-time processing method based on the Chirp Scaling (CS) imaging algorithm is studied in this paper, and a distributed data processing system is built with field-programmable gate array (FPGA) chips as the kernel. Different from traditional CS processing, the system divides data processing into three stages, and the computing tasks are reasonably allocated to different data processing units (i.e., satellites) in each stage. The method effectively saves the computing and storage resources of the satellites, improves the utilization rate of a single satellite, and shortens the data processing time. Gaofen-3 (GF-3) satellite SAR raw data were processed by the system, and the performance of the method was verified.
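A highly simplified sketch of the three-stage split is shown below; the stage boundaries and the FFT placeholders are assumptions for illustration, and the actual Chirp Scaling phase functions and the FPGA implementation are not reproduced.

```python
import numpy as np

def stage1_chirp_scaling(raw):         # unit/satellite 1: azimuth FFT + chirp-scaling phase
    return np.fft.fft(raw, axis=0)     # placeholder for the actual phase multiply

def stage2_range_processing(data):     # unit 2: range FFT, range compression, bulk RCMC
    return np.fft.fft(data, axis=1)    # placeholder

def stage3_azimuth_compression(data):  # unit 3: azimuth compression and inverse FFTs
    return np.fft.ifft2(data)          # placeholder

def distributed_pipeline(raw_blocks):
    """Blocks are handed from one processing unit to the next, so the three
    units can work on different blocks concurrently (pipeline parallelism)."""
    return [stage3_azimuth_compression(
                stage2_range_processing(
                    stage1_chirp_scaling(block)))
            for block in raw_blocks]
```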
Funding: Project (20030533011) supported by the National Research Foundation for the Doctoral Program of Higher Education of China.
Abstract: A DMVOCC-MVDA (distributed multiversion optimistic concurrency control with multiversion dynamic adjustment) protocol was presented to process mobile distributed real-time transactions in mobile broadcast environments. At the mobile hosts, all transactions perform local pre-validation. The local pre-validation process is carried out against the transactions committed at the server in the last broadcast cycle. Transactions that survive local pre-validation must be submitted to the server for final validation. The new protocol eliminates conflicts between mobile read-only and mobile update transactions, and resolves data conflicts flexibly by using multiversion dynamic adjustment of the serialization order to avoid unnecessary restarts of transactions. Mobile read-only transactions can be committed without blocking, and the response time of mobile read-only transactions is greatly shortened. The tolerance of mobile transactions to disconnections from the broadcast channel is increased. In global validation, mobile distributed transactions are checked to ensure distributed serializability across all participants. The simulation results show that the proposed concurrency control protocol offers better performance than other protocols in terms of miss rate, restart rate and commit rate. Under a high workload (think time of 1 s), the miss rate of DMVOCC-MVDA is only 14.6%, significantly lower than that of other protocols. The restart rate of DMVOCC-MVDA is only 32.3%, showing that DMVOCC-MVDA can effectively reduce the restart rate of mobile transactions. The commit rate of DMVOCC-MVDA reaches 61.2%, which is markedly higher than that of other protocols.
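A minimal sketch of the local pre-validation step, assuming each transaction is summarized by read/write sets of data-item identifiers (the multiversion bookkeeping and the dynamic adjustment of the serialization order are omitted):

```python
def local_prevalidate(read_set, committed_write_sets):
    """Local pre-validation at a mobile host.

    A transaction survives only if none of its reads conflict with items
    written by transactions committed at the server during the last broadcast
    cycle; surviving update transactions are then submitted for final
    validation at the server.
    """
    for write_set in committed_write_sets:
        if read_set & write_set:   # read-write conflict with a committed transaction
            return False           # restart (or adjust) the local transaction
    return True

# toy usage: the transaction read {"x", "y"}; the last cycle committed writes to {"z"} and {"y"}
print(local_prevalidate({"x", "y"}, [{"z"}, {"y"}]))   # False -> must restart
```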
Funding: Project (2009AA11Z220) supported by the National High Technology Research and Development Program of China.
Abstract: Video processing is one challenge in collecting vehicle trajectories from an unmanned aerial vehicle (UAV), and road boundary estimation is one way to improve video processing algorithms. However, current methods do not work well for low-volume roads, which are not well marked and contain noise such as vehicle tracks. A fusion-based method termed Dempster-Shafer-based road detection (DSRD) is proposed to address this issue. This method detects road boundaries by combining multiple information sources using Dempster-Shafer theory (DST). In order to test the performance of the proposed method, two field experiments were conducted, one on a highway partially covered by snow and the other on a dense-traffic highway. The results show that DSRD is robust and accurate, with detection rates of 100% and 99.8% compared with manual detection results. DSRD was then adopted to improve the UAV video processing algorithm, and the vehicle detection and tracking rates improved by 2.7% and 5.5%, respectively. The computation time also decreased by 5% and 8.3% for the two experiments, respectively.
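The fusion step relies on Dempster's rule of combination; a small generic implementation over a two-element frame is sketched below (the actual information sources and their mass assignments are hypothetical and not taken from the paper).

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# toy frame for road detection: {road} vs {background}, with ignorance mass on the whole frame
ROAD, BG = frozenset({"road"}), frozenset({"background"})
FRAME = ROAD | BG
m_cue1 = {ROAD: 0.6, BG: 0.1, FRAME: 0.3}   # e.g. a color-based cue (hypothetical numbers)
m_cue2 = {ROAD: 0.5, BG: 0.2, FRAME: 0.3}   # e.g. a texture-based cue
print(dempster_combine(m_cue1, m_cue2))
```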
Funding: Project (50374079) supported by the National Natural Science Foundation of China.
Abstract: It is difficult to detect anomalies whose matching relationship among some data attributes is very different from that of the other samples in a dataset. Aiming at this problem, an approach based on wavelet analysis for detecting and amending anomalous samples was proposed. Taking full advantage of the multi-resolution and local analysis properties of wavelet analysis, this approach is able to detect and amend anomalous samples effectively. To realize rapid numerical computation of the wavelet transform for a discrete sequence, a modified algorithm based on the Newton-Cotes formula was also proposed. The experimental results show that the approach is feasible, with good results and good practicality.
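A minimal sketch of wavelet-based anomaly flagging using PyWavelets is given below; thresholding the finest-scale detail coefficients against a robust noise estimate is a common heuristic, not the paper's specific detection and amendment rules or its Newton-Cotes-based transform computation.

```python
import numpy as np
import pywt

def wavelet_anomaly_flags(x, wavelet="db4", level=3, k=4.0):
    """Flag samples whose finest-scale wavelet detail coefficients are
    unusually large, using a robust (MAD-based) noise estimate."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    d1 = coeffs[-1]                             # finest-scale detail coefficients
    sigma = np.median(np.abs(d1)) / 0.6745      # robust estimate of the noise scale
    hits = np.where(np.abs(d1) > k * sigma)[0]
    return hits * 2                             # approximate sample positions (dyadic decimation)
```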
Funding: Project (11174235) supported by the National Natural Science Foundation of China; Project (3102014JC02010301) supported by the Fundamental Research Funds for the Central Universities, China.
Abstract: The use of underwater acoustic data has rapidly expanded with the application of multichannel, large-aperture underwater detection arrays. This study presents an underwater acoustic data compression method based on compressed sensing. Underwater acoustic signals are transformed into the sparse domain for data storage at the receiving terminal, and the improved orthogonal matching pursuit (IOMP) algorithm is used to reconstruct the original underwater acoustic signals at the data processing terminal. Even when an increase in sidelobe level occasionally causes a direction-of-arrival estimation error, the proposed compression method achieves a compression ratio 10 times higher for narrowband signals and 5 times higher for wideband signals than the orthogonal matching pursuit (OMP) algorithm. The IOMP algorithm also reduces the computing time by about 20% compared with the original OMP algorithm. Simulation and experimental results are discussed.
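For context, a plain orthogonal matching pursuit reconstruction (the baseline that IOMP improves on) can be sketched as follows; the paper's IOMP modification itself is not reproduced here.

```python
import numpy as np

def omp(Phi, y, sparsity):
    """Plain orthogonal matching pursuit: greedily pick the atom most
    correlated with the residual, then re-fit the selected atoms by least
    squares and update the residual."""
    residual, support = y.astype(float).copy(), []
    coeffs = np.zeros(0)
    for _ in range(sparsity):
        correlations = np.abs(Phi.T @ residual)
        correlations[support] = 0.0                     # do not pick an atom twice
        support.append(int(np.argmax(correlations)))
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coeffs
    return x_hat
```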
Abstract: A new file assignment strategy for parallel I/O, named the heuristic file sorted assignment algorithm, was proposed for cluster computing systems. Based on load balancing, it assigns files with similar service times to the same disk. First, the files are sorted and stored in a set I in descending order of their service time; then, when files are to be assigned, one disk of a cluster node is selected randomly; finally, consecutive files are taken in order from the set I and placed on that disk until the disk reaches its maximum load. The experimental results show that the new strategy improves performance by 20.2% when the system load is light and by 31.6% when the load is heavy. Moreover, the higher the data access rate, the more evident the performance improvement obtained by the heuristic file sorted assignment algorithm.
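A sketch of the strategy as described reads roughly as follows; the load cap, the service-time representation, and the handling of leftover files are illustrative assumptions.

```python
import random

def heuristic_file_sorted_assignment(files, n_disks, load_cap):
    """Sort files by service time in descending order (set I), pick disks in a
    random order, and place consecutive files from the sorted set onto the
    current disk until its load would exceed the cap.

    `files` is a list of {"name": ..., "service_time": ...} dicts.
    """
    queue = sorted(files, key=lambda f: f["service_time"], reverse=True)  # set I
    assignment = {d: [] for d in range(n_disks)}
    loads = {d: 0.0 for d in range(n_disks)}
    disks = list(range(n_disks))
    random.shuffle(disks)                       # random disk selection order
    for d in disks:
        while queue and loads[d] + queue[0]["service_time"] <= load_cap:
            f = queue.pop(0)                    # next largest remaining file
            assignment[d].append(f["name"])
            loads[d] += f["service_time"]
    return assignment, queue                    # `queue` holds any files that did not fit
```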
Abstract: Based on the scalar wave equation and using the ray approximation of reflected seismic data (CMP or CSP gathers), the authors derive the projection functions of the primary waves and the multiple waves at near offset (CMP or CSP gathers) in the parabolic Radon transform (PRT) domain. From a geometric point of view, the authors prove that in the PRT domain the reflection energy still distributes along hyperbolas of higher curvature and concentrates into energy clusters. Thus the primary waves and the multiple waves, which interweave with each other in the (x, t) domain, can be completely separated, which allows the multiples to be eliminated by filtering or muting. This is important for velocity analysis and for the separation and elimination of multiple waves.
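The parabolic Radon pair referred to above is commonly written as follows; the paper's exact near-offset parameterization is not given in the abstract, so this standard convention is assumed.

```latex
% Forward and adjoint parabolic Radon transform (standard near-offset convention):
m(q, \tau) \;=\; \int d\bigl(x,\; t = \tau + q\,x^{2}\bigr)\,\mathrm{d}x,
\qquad
d(x, t) \;=\; \int m\bigl(q,\; \tau = t - q\,x^{2}\bigr)\,\mathrm{d}q
```

Here τ is the zero-offset intercept time and q the curvature parameter; primaries and multiples map to different regions of the (q, τ) plane, so multiples can be muted there before the data are inverse-transformed.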