Real Time Dense Depth Estimation by Fusing Stereo with Sparse Depth Measurements

Published in 2019 International Conference on Robotics and Automation (ICRA)

Recommended citation: S. S. Shivakumar, K. Mohta, B. Pfrommer, V. Kumar and C. J. Taylor, "Real Time Dense Depth Estimation by Fusing Stereo with Sparse Depth Measurements," 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 2019, pp. 6482-6488. doi: 10.1109/ICRA.2019.8794023. https://ieeexplore.ieee.org/abstract/document/8794023

We present an approach to depth estimation that fuses information from a stereo pair with sparse range measurements derived from a LIDAR sensor or a range camera. The goal of this work is to exploit the complementary strengths of the two sensor modalities: the accurate but sparse range measurements and the ambiguous but dense stereo information. These two sources are effectively and efficiently fused by combining ideas from anisotropic diffusion and semi-global matching. We evaluate our approach on the KITTI 2015 and Middlebury 2014 datasets, using randomly sampled ground truth range measurements as our sparse depth input. We achieve significant performance improvements with a small fraction of range measurements on both datasets. We also provide qualitative results from our platform using the PMDTec Monstar sensor. Our entire pipeline runs on an NVIDIA TX2 platform at 5 Hz on 1280×1024 stereo images with 128 disparity levels.
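To make the fusion idea concrete, here is a minimal sketch of one way sparse depth can steer a stereo cost volume, using edge-aware diffusion of the sparse seeds and a confidence-weighted prior term. This is an illustrative toy, not the paper's pipeline: the function names (`cost_volume`, `diffuse_sparse`, `fuse`), the quadratic prior penalty, and the winner-take-all step standing in for semi-global matching are all assumptions made for the example.

```python
import numpy as np

def cost_volume(left, right, max_disp):
    """Naive absolute-difference matching cost: cost[d, y, x] = |L(y, x) - R(y, x - d)|."""
    h, w = left.shape
    cost = np.full((max_disp, h, w), 255.0)
    for d in range(max_disp):
        cost[d, :, d:] = np.abs(left[:, d:] - right[:, :w - d])
    return cost

def diffuse_sparse(seeds, mask, guide, iters=100, beta=10.0):
    """Edge-aware (anisotropic) diffusion of sparse disparity seeds across the image.

    seeds: disparity at measured pixels, 0 elsewhere; mask: True where measured;
    guide: grayscale image whose intensity edges gate the diffusion.
    Returns a dense disparity prior and a confidence map in [0, 1].
    """
    d = seeds.astype(float)
    w = mask.astype(float)
    for _ in range(iters):
        for shift in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            g = np.roll(guide, shift, axis=(0, 1))
            conduct = np.exp(-beta * np.abs(guide - g) / 255.0)  # small across edges
            d_n, w_n = np.roll(d, shift, axis=(0, 1)), np.roll(w, shift, axis=(0, 1))
            d = (d * w + conduct * d_n * w_n) / np.maximum(w + conduct * w_n, 1e-6)
            w = np.maximum(w, conduct * w_n)
        d[mask], w[mask] = seeds[mask], 1.0  # pin the hard measurements
    return d, w

def fuse(cost, prior, confidence, lam=0.1):
    """Bias the stereo cost volume toward the diffused prior, weighted by confidence."""
    disp = np.arange(cost.shape[0], dtype=float)[:, None, None]
    return cost + lam * confidence[None] * (disp - prior[None]) ** 2

# Toy usage: a synthetic pair with constant disparity 4 and ~2% sparse "range" samples.
rng = np.random.default_rng(0)
left = rng.uniform(0.0, 255.0, (64, 96))
right = np.roll(left, -4, axis=1)                  # L(y, x) = R(y, x - 4)
mask = rng.random(left.shape) < 0.02
seeds = np.where(mask, 4.0, 0.0)

prior, conf = diffuse_sparse(seeds, mask, left)
disparity = np.argmin(fuse(cost_volume(left, right, 16), prior, conf), axis=0)
print("mean |error|:", np.abs(disparity[:, 4:] - 4).mean())
```

In this sketch the prior term simply reshapes the per-pixel matching cost before the minimum is taken; in the paper the sparse measurements instead interact with the semi-global matching aggregation, which this toy does not reproduce.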

Download paper here
