ZHANG Dun-feng, ZHANG Hua, WANG Heng, ZHANG Jing, XU Xiao-long. Multi-window color image stereo matching algorithm based on improved rank transform[J]. Journal of Yunnan University (Natural Sciences Edition), 2014, 36(3): 347-352. DOI: 10.7540/j.ynu.20130408

Multi-window color image stereo matching algorithm based on improved rank transform

Abstract: To overcome interference from environmental variation and from differences between the two cameras of a binocular rig, and to address the high mismatch rate of traditional stereo matching algorithms in disparity-discontinuity regions, a multi-window matching algorithm for color images in the rank transform domain is proposed. First, color images from a standard image database are mapped from color space to rank space by a graded, improved rank transform. Second, pixel color similarity is computed with an improved absolute-value exponent method, which suppresses the effects of image noise and illumination differences between the stereo pair. Finally, disparity is estimated by a multi-window algorithm with a fast search strategy. Experiments show that the algorithm is strongly noise-resistant and produces robust matching results in real time; compared with a fixed-window non-parametric transform matching algorithm, matching accuracy in disparity-discontinuity regions improves by 18.5%.
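The first step named in the abstract is a rank transform of each input image. The paper's graded improvement is not detailed on this page, but the classic rank transform it builds on (Zabih and Woodfill's non-parametric transform) can be sketched as follows; this is a minimal NumPy sketch, and the function name and default window radius are illustrative, not taken from the paper:

```python
import numpy as np

def rank_transform(gray: np.ndarray, radius: int = 2) -> np.ndarray:
    """Classic rank transform: replace each pixel by the count of
    neighbours in a (2*radius+1)^2 window whose intensity is strictly
    less than the centre pixel. Edges are handled by replication."""
    h, w = gray.shape
    padded = np.pad(gray.astype(np.int32), radius, mode="edge")
    rank = np.zeros((h, w), dtype=np.int32)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue  # the centre pixel is not compared with itself
            shifted = padded[radius + dy : radius + dy + h,
                             radius + dx : radius + dx + w]
            rank += (shifted < gray).astype(np.int32)
    return rank

# Example: 3x3 gradient image, 3x3 window (radius=1).
img = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]], dtype=np.uint8)
print(rank_transform(img, radius=1))  # centre pixel: 4 of 8 neighbours are darker
```

In a full matcher, both rectified images would be transformed this way before window-based cost aggregation; because the result depends only on the local ordering of intensities, the subsequent matching cost becomes insensitive to camera gain and illumination differences, which is the robustness the abstract refers to.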
