Abstract
In recent years, many DOA estimation methods based on sparse representation have been proposed. These methods all require a set of pre-defined discrete grid points, yet the actual directions of arrival are random in the spatial domain and every direction is equally likely, so the true directions may well fall off the grid. The resulting grid mismatch introduces a large bias into the DOA estimates. To improve estimation accuracy, this paper proposes an off-grid DOA estimation model. To increase the degrees of freedom for direction finding, a coprime array consisting of two uniform linear arrays placed in parallel in the same plane is employed. By vectorizing the cross-covariance matrix of the two uniform linear arrays into a cross-covariance vector, a one-dimensional virtually extended received-data vector is obtained; a sparse recovery algorithm is then applied within the sparse representation framework to recover the vector associated with the DOA parameters, from which unique and automatically paired 2D DOA estimates are derived. Simulation results verify that the proposed algorithm achieves better DOA estimation performance than traditional algorithms.
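As an illustration of the cross-covariance vectorization step described in the abstract, below is a minimal NumPy sketch. All parameters (subarray sizes M and N, spacings, source angles, noise level) are illustrative assumptions, the geometry is simplified to a single angular dimension rather than the paper's 2D parallel-array setup, and the matched-filter grid search at the end is a simple stand-in for the paper's off-grid sparse recovery algorithm.

```python
import numpy as np

# --- Illustrative parameters (not from the paper) --------------------
M, N = 4, 5                        # coprime pair: sizes of the two subarrays
d = 0.5                            # unit spacing in wavelengths
K = 2                              # number of sources
theta = np.deg2rad([20.0, 45.0])   # example source angles
snapshots = 500

# Subarray element positions with coprime spacings N*d and M*d.
pos1 = np.arange(M) * N * d
pos2 = np.arange(N) * M * d

# Steering matrices for a single angular dimension.
A1 = np.exp(2j * np.pi * np.outer(pos1, np.sin(theta)))
A2 = np.exp(2j * np.pi * np.outer(pos2, np.sin(theta)))

# Simulated snapshots: uncorrelated unit-power sources plus noise.
rng = np.random.default_rng(0)
s = (rng.standard_normal((K, snapshots)) + 1j * rng.standard_normal((K, snapshots))) / np.sqrt(2)
n1 = 0.1 * (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots)))
n2 = 0.1 * (rng.standard_normal((N, snapshots)) + 1j * rng.standard_normal((N, snapshots)))
x1 = A1 @ s + n1
x2 = A2 @ s + n2

# Cross-covariance of the two subarrays and its vectorization:
# vec(R12) = (conj(A2) ⊙ A1) p, the virtually extended data vector.
R12 = x1 @ x2.conj().T / snapshots
z = R12.flatten('F')               # column-stacking vec(.)

def virtual_steer(th):
    """Virtual-array steering vector: vec(a1(th) a2(th)^H) = conj(a2) ⊗ a1."""
    a1 = np.exp(2j * np.pi * pos1 * np.sin(th))
    a2 = np.exp(2j * np.pi * pos2 * np.sin(th))
    return np.kron(a2.conj(), a1)

# Simple grid-matching spectrum (a stand-in for sparse recovery).
grid = np.deg2rad(np.linspace(-60, 60, 241))
spec = np.array([np.abs(virtual_steer(th).conj() @ z) for th in grid])
peaks = [i for i in range(1, len(grid) - 1) if spec[i - 1] < spec[i] > spec[i + 1]]
top = sorted(peaks, key=lambda i: spec[i], reverse=True)[:K]
print("estimated DOAs (deg):", sorted(np.round(np.rad2deg(grid[top]), 1)))
```

The Kronecker structure in `virtual_steer` mirrors the identity vec(R12) = (conj(A2) ⊙ A1) p, where ⊙ denotes the Khatri-Rao product and p collects the source powers; because the noise at the two subarrays is uncorrelated, it largely averages out of the cross-covariance, which is one motivation for working with the cross-covariance rather than the auto-covariance.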
Authors
Zeng Fuhong; Si Weijian; Peng Zhanli
(College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)
Source
Aero Weaponry (《航空兵器》), 2019, No. 3, pp. 27-32 (6 pages)
Indexed in: CSCD; Peking University Core Journals (北大核心)
Funding
National Natural Science Foundation of China (61671168)
Natural Science Foundation of Heilongjiang Province (QC2016085)
Fundamental Research Funds for the Central Universities (HEUCF180801; HEUCFJ180801)
Author Biography
Zeng Fuhong (b. 1993), female, from Hengyang, Hunan; Ph.D. candidate. Research interest: DOA estimation with coprime arrays. E-mail: fuhongzeng@gmail.com