Abstract
To address the limited target attribute recognition capability of a single radar sensor, a target attribute recognition method based on D-S evidence theory is proposed that fuses radar track and photoelectric image information. A ResNet network and an XGBoost model are used to recognize target attributes from photoelectric images and radar track features, respectively, and the resulting class probability assignments are fused with the D-S combination rule to obtain the final target attribute recognition result. Experiments show that the fused model outperforms either single model at both long-range and close-range target attribute recognition, and that fusion can correct final recognition errors caused by a single model's mistake. On the test set, the average per-class recall of the fused model is 3% higher than that of the photoelectric image classification model and 10% higher than that of the radar track classification model, reaching an average recall of 95%.
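As a rough illustration of the fusion step the abstract describes, the sketch below combines two class-probability vectors with Dempster's combination rule, under the simplifying assumption that every focal element is a singleton class (i.e., each classifier's softmax/probability output is taken directly as a basic probability assignment). The class count, probability values, and function name are hypothetical, not taken from the paper.

```python
# Minimal sketch: fuse two classifiers' class-probability outputs with
# Dempster's rule. In the paper's setting, one vector would come from a
# ResNet on photoelectric images and the other from an XGBoost model on
# radar track features; the numbers below are illustrative only.

import numpy as np

def dempster_combine(m1: np.ndarray, m2: np.ndarray) -> np.ndarray:
    """Dempster's rule for BPAs whose focal elements are all singletons.

    With singleton-only focal elements, B ∩ C is non-empty only when
    B == C, so the combined mass reduces to the normalized
    element-wise product of the two assignments.
    """
    joint = m1 * m2                # agreement mass on each singleton class
    conflict = 1.0 - joint.sum()   # K: total mass on conflicting class pairs
    if np.isclose(conflict, 1.0):
        raise ValueError("Total conflict: the two BPAs cannot be combined.")
    return joint / (1.0 - conflict)

# Hypothetical outputs for three target classes.
m_image = np.array([0.70, 0.20, 0.10])  # e.g. ResNet softmax probabilities
m_track = np.array([0.60, 0.30, 0.10])  # e.g. XGBoost class probabilities

m_fused = dempster_combine(m_image, m_track)
print(m_fused, "-> predicted class:", int(np.argmax(m_fused)))
```

Because the rule renormalizes the product of the two assignments, classes on which both models agree are reinforced, which is how fusion can override a single model's mistaken top-1 prediction when the other model disagrees confidently.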
Authors
LI Zhengdong; YANG Fan; WANG Changcheng; ZHOU Yingyue (School of Information Engineering, Southwest University of Science and Technology, Mianyang 621000, China; Automation Research Institute Co., Ltd. of China South Industries Group Corporation, Mianyang 621000, China; Sichuan Provincial Key Laboratory of Robot Technology Used for Special Environment, Southwest University of Science and Technology, Mianyang 621000, China)
Source
《兵器装备工程学报》(Journal of Ordnance Equipment Engineering)
Indexed in CAS and CSCD; Peking University Core Journal (北大核心)
2024, No. 2, pp. 232-237 (6 pages)
Funding
"14th Five-Year Plan" Pre-research Fund Project (628010205)
Sichuan Science and Technology Program (2021YFG0383)
Southwest University of Science and Technology "Curriculum Ideology and Politics" Demonstration Course Construction Project (21szkc52)
About the Authors
LI Zhengdong (1997-), male, Master of Engineering, E-mail: 493139639@qq.com. Corresponding author: ZHOU Yingyue, female, associate research fellow, E-mail: zhouyingyue@swust.edu.cn.