
Crop row detection and tracking based on binocular vision and adaptive Kalman filter
Abstract: A crop row in the field is an indicator of road information for vehicle navigation in intelligent agriculture. However, existing crop row detection methods do not cope well with complex field conditions: their precision and accuracy are sensitive to environmental interference, and detection results can deviate considerably between adjacent images, giving low positioning accuracy and stability. In this study, an improved method was proposed to detect and track crop rows and supply navigation parameters, using binocular vision and an adaptive Kalman filter. The improved system consists of crop row detection and crop row tracking.

For detection, an image preprocessing algorithm was first established to segment the gray features of vegetation using an improved Excess Green minus Excess Red (ExG-ExR) model and the maximum inter-class variance method (Otsu). Corner points describing the shape of the crop row canopy were then extracted from the grayscale image with the Smallest Univalue Segment Assimilating Nucleus (SUSAN) detector, and their three-dimensional coordinates were computed by stereo matching and parallax ranging. Points whose three-dimensional coordinates fell within a threshold range were taken as crop row feature points. A straight-line model based on Principal Component Analysis (PCA) was established to indicate the crop row centerline, with the fitting points selected according to the frequency statistics of the feature points.
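The segmentation and line-fitting steps named above are standard enough to illustrate. The block below is a minimal Python/OpenCV sketch, not the authors' code: it uses the conventional ExG-ExR index (the paper uses an improved variant whose exact form is not stated in the abstract) with Otsu thresholding, plus a PCA-based straight-line fit to feature points. The function names segment_vegetation and fit_centerline_pca, and all parameters, are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): ExG-ExR vegetation segmentation with Otsu
# thresholding, plus a PCA-based line fit for a set of crop-row feature points.
import cv2
import numpy as np

def segment_vegetation(bgr):
    """Return a binary vegetation mask from a BGR image using ExG-ExR + Otsu."""
    b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
    exg = 2.0 * g - r - b          # Excess Green
    exr = 1.4 * r - g              # Excess Red
    gray = exg - exr               # ExG-ExR gray feature
    gray = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Otsu's method (maximum inter-class variance) picks the global threshold.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

def fit_centerline_pca(points_xy):
    """Fit a straight line to N x 2 feature points by principal component analysis.

    Returns (point_on_line, unit_direction): the mean point and the first
    principal axis, taken as the crop-row centerline direction.
    """
    pts = np.asarray(points_xy, dtype=np.float64)
    mean = pts.mean(axis=0)
    # Eigenvector of the covariance matrix with the largest eigenvalue.
    cov = np.cov((pts - mean).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    direction = eigvecs[:, np.argmax(eigvals)]
    return mean, direction / np.linalg.norm(direction)
```

In the paper's pipeline, the points passed to the line fit would be the SUSAN corners whose stereo-derived three-dimensional coordinates fall inside the crop-row threshold range.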
For tracking, a target planning algorithm extracted the crop row located in the central area of the image as the tracking target after the centerlines were detected. A linear state equation of the target was established, considering the agronomy of field crops, and the crop row tracking model was built with the Sage-Husa adaptive Kalman filter. The heading angle was defined as the deviation angle between the target and the current heading of the vehicle; the viewpoint intercept, describing the target position, was defined as the transverse intercept at 1 m depth of field.

Videos of a cotton field were captured with a binocular camera (BB2-08S2C-25) to verify the improved system. The image data included shadows, weeds, turnrows, uneven roads, and other field scenes. The results showed that the improved system detected the crop row centerline accurately and rapidly, with a detection accuracy of 92.36% and an average processing time of 80.25 ms per frame. Specifically, the average, standard deviation, and maximum absolute deviation of the centerline heading angle were 0.31°, 2.55°, and 12.01°, respectively, and those of the centerline lateral deviation were -1.90, 8.19, and 38.18 pixels, respectively. The Sage-Husa Kalman-based tracking also performed much better than classical Kalman-based tracking, following the target rapidly and without hysteresis while correcting false or missed centerline detections. Compared with detection alone, tracking further improved the direction and position accuracy of the centerline estimate: the standard deviation and maximum amplitude of the heading angle were 2.62° and 9.19°, reduced by 22.94% and 43.69%, respectively, relative to the non-tracking state; the standard deviation and maximum amplitude of the viewpoint intercept were 0.043 m and 0.145 m, reduced by 10.42% and 5.23%, respectively; and the standard deviations of the frame-to-frame differences in heading angle and viewpoint intercept were reduced by 67.02% and 40.00%, respectively. The improved system can therefore perceive the heading and position of the crop row quickly and accurately, providing continuous and stable guidance parameters for a navigation system.
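For the tracking stage, the commonly published simplified Sage-Husa adaptive Kalman filter can be sketched as follows. This is an assumption-laden illustration rather than the paper's model: the state vector is assumed to be [heading angle, viewpoint intercept], observed directly from each frame's centerline detection with an identity transition model, and both noise covariances are re-estimated through a forgetting factor.

```python
# Minimal sketch (not the authors' implementation) of a Sage-Husa adaptive Kalman
# filter for tracking crop-row centerline parameters. The state vector is an
# assumption: [heading angle (deg), viewpoint intercept (m)], measured directly.
import numpy as np

class SageHusaKalman:
    def __init__(self, x0, P0, Q0, R0, b=0.95):
        self.x = np.asarray(x0, dtype=float)   # state estimate
        self.P = np.asarray(P0, dtype=float)   # estimate covariance
        self.Q = np.asarray(Q0, dtype=float)   # process-noise covariance (adapted)
        self.R = np.asarray(R0, dtype=float)   # measurement-noise covariance (adapted)
        self.A = np.eye(len(self.x))           # transition model (assumed identity)
        self.H = np.eye(len(self.x))           # observation model (direct measurement)
        self.b = b                             # forgetting factor, 0 < b < 1
        self.k = 0                             # step counter

    def update(self, z):
        """Fuse one measurement z (the detected centerline parameters)."""
        self.k += 1
        d = (1.0 - self.b) / (1.0 - self.b ** self.k)   # adaptive weight

        # Prediction
        x_pred = self.A @ self.x
        P_pred = self.A @ self.P @ self.A.T + self.Q

        # Innovation and adaptive measurement-noise estimate
        e = np.asarray(z, dtype=float) - self.H @ x_pred
        self.R = (1.0 - d) * self.R + d * (np.outer(e, e) - self.H @ P_pred @ self.H.T)

        # Standard Kalman gain and correction
        S = self.H @ P_pred @ self.H.T + self.R
        K = P_pred @ self.H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ e
        P_new = (np.eye(len(self.x)) - K @ self.H) @ P_pred

        # Adaptive process-noise estimate (uses the previous P, still in self.P)
        self.Q = (1.0 - d) * self.Q + d * (
            K @ np.outer(e, e) @ K.T + P_new - self.A @ self.P @ self.A.T
        )

        self.x, self.P = x_new, P_new
        return self.x
```

A typical per-frame use would be tracker = SageHusaKalman(x0=[0.0, 0.0], P0=np.eye(2), Q0=0.01 * np.eye(2), R0=np.eye(2)) followed by tracker.update([heading_deg, intercept_m]) after each detection; the adaptive noise estimates are intended to let the filter smooth over occasional false or missed detections without lagging the true row direction.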
Authors: Zhai Zhiqiang; Xiong Kun; Wang Liang; Du Yuefeng; Zhu Zhongxiang; Mao Enrong (College of Engineering, China Agricultural University, Beijing 100083, China)
Source: Transactions of the Chinese Society of Agricultural Engineering (《农业工程学报》), 2022, Issue 8, pp. 143-151 (9 pages). Indexed in EI, CAS, CSCD, and the Peking University Core Journals list.
Funding: National Natural Science Foundation of China (32101622).
Keywords: navigation; machine vision; image processing; field crops; crop row detection; crop row tracking; linear state observation
About the author: Zhai Zhiqiang, Ph.D., Lecturer; research interest: autonomous navigation technology for agricultural machinery. Email: zhaizhiqiang@cau.edu.cn.