Kinect range sensing: Structured-light versus Time-of-Flight Kinect

https://doi.org/10.1016/j.cviu.2015.05.006

Highlights

  • This work compares the Kinect Structured-Light camera with the Kinect Time-of-Flight camera.

  • The results describe the conditions under which one device is superior to the other.

  • Solid insight into the devices is given to support decisions on their application.

  • We propose a set of nine tests for comparing both Kinects, five of which are novel.

Abstract

Recently, Microsoft has released the new Kinect One, providing the next generation of real-time range sensing devices based on the Time-of-Flight (ToF) principle. As the first Kinect version used a structured-light approach, one would expect various differences in the characteristics of the range data delivered by the two devices.

This paper presents a detailed and in-depth comparison between the two devices. In order to conduct the comparison, we propose a framework of seven different experimental setups, which serves as a generic basis for evaluating range cameras such as the Kinect. The experiments have been designed to capture the individual effects of the Kinect devices in as isolated a manner as possible, and in such a way that they can also be adapted to any other range sensing device. The overall goal of this paper is to provide a solid insight into the pros and cons of either device. Thus, scientists who are interested in using Kinect range sensing cameras in their specific application scenario can directly assess the expected, specific benefits and potential problems of either device.

Section snippets

Introduction and related works

In the last decade, several new range sensing devices have been developed and made available for application development at affordable cost. In 2010, Microsoft, in cooperation with PrimeSense, released a structured-light (SL) based range sensing camera, the so-called Kinect™, which delivers reliable depth images at VGA resolution and 30 Hz, coupled with an RGB color camera at the same image resolution. Even though the camera was mainly designed for gaming, it achieved great popularity…
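To give a concrete impression of how such depth and color streams are typically accessed in practice, the following minimal sketch grabs synchronized frames from a KinectSL through OpenCV's OpenNI capture backend. This is purely illustrative and assumes an OpenCV build with OpenNI support and attached hardware; the KinectToF is not served by this backend and would instead require the Kinect for Windows SDK or libfreenect2.

    # Minimal sketch (Python/OpenCV): grab synchronized depth and color frames
    # from a KinectSL via the OpenNI capture backend. Assumes an OpenCV build
    # with OpenNI support; the KinectToF is not covered by this backend.
    import cv2

    cap = cv2.VideoCapture(cv2.CAP_OPENNI)          # open the first OpenNI device
    if not cap.isOpened():
        raise RuntimeError("no OpenNI-compatible depth camera found")

    for _ in range(30):                             # roughly one second at 30 Hz
        if not cap.grab():                          # advance both streams together
            continue
        ok_d, depth = cap.retrieve(None, cv2.CAP_OPENNI_DEPTH_MAP)   # uint16, in mm
        ok_c, color = cap.retrieve(None, cv2.CAP_OPENNI_BGR_IMAGE)   # 8-bit BGR
        if ok_d and ok_c:
            print(depth.shape, color.shape)         # both VGA (480 x 640) by default

    cap.release()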

Structured light cameras: KinectSL

Even though the principle of structured light (SL) range sensing is comparatively old, the launch of the Microsoft Kinect™ (KinectSL) in 2010 as an interaction device for the Xbox 360 clearly demonstrates the maturity of the underlying principle.
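As a reminder of the geometry behind this principle, the projector and the infrared camera form a stereo pair, and depth follows from the disparity at which the projected pattern is observed. The following minimal sketch illustrates that depth-from-disparity relation; the focal length and baseline are hypothetical placeholder values, not the KinectSL's actual calibration.

    # Minimal sketch of the triangulation relation behind structured-light
    # sensing: z = f * b / d. The constants are hypothetical placeholders,
    # not KinectSL calibration data.
    import numpy as np

    FOCAL_LENGTH_PX = 580.0     # assumed focal length of the IR camera, in pixels
    BASELINE_M = 0.075          # assumed projector-camera baseline, in meters

    def depth_from_disparity(disparity_px):
        """Depth in meters from the observed pattern disparity in pixels."""
        d = np.asarray(disparity_px, dtype=np.float64)
        with np.errstate(divide="ignore"):
            z = FOCAL_LENGTH_PX * BASELINE_M / d
        return np.where(d > 0, z, np.nan)            # no disparity -> no depth

    print(depth_from_disparity([10.0, 20.0, 40.0]))  # [4.35, 2.175, 1.0875] m

Because the disparity is quantized in pixel steps, the achievable depth resolution of such a triangulation-based sensor degrades quadratically with distance, one of the known characteristics of the KinectSL.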

General considerations for comparing KinectSL and KinectToF

Before presenting the experimental setups and the comparison between the two Kinect devices, we have to consider the limitations which this kind of comparison encounters. For both the KinectSL and the KinectToF cameras, there are no official, publicly available reference implementations which explain all stages from raw data acquisition to the final range data delivery. Thus, any effect observed may relate either to the sensor hardware, i.e. to the measurement principle as such, or to the…

Experimental results and comparison

In Sections 4.2–4.8 we present the different test scenarios we designed in order to capture specific error sources of the KinectSL and the KinectToF cameras. Before going into the scenarios, we briefly present the camera parameters and the pixel statistics in Section 4.1.
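As an illustration of what such per-pixel statistics can look like, the sketch below computes the temporal mean, the temporal noise, and the dropout rate for every pixel over a stack of recorded depth frames. This is not the authors' exact procedure, only a plausible form of it; the frame stack, the resolution, and the convention that a value of 0 marks an invalid measurement are assumptions.

    # Hedged sketch: per-pixel temporal statistics over a stack of depth frames.
    # `frames` is assumed to be an (N, H, W) array of depth values in millimeters,
    # with 0 marking invalid (dropped) measurements.
    import numpy as np

    def pixel_statistics(frames):
        frames = np.asarray(frames, dtype=np.float64)
        valid = frames > 0                                   # mask out dropouts
        n_valid = valid.sum(axis=0)

        with np.errstate(invalid="ignore", divide="ignore"):
            mean = np.where(n_valid > 0,
                            frames.sum(axis=0, where=valid) / n_valid, np.nan)
            var = np.where(n_valid > 1,
                           ((frames - mean) ** 2).sum(axis=0, where=valid)
                           / (n_valid - 1), np.nan)

        return {
            "mean_mm": mean,                                  # per-pixel temporal mean
            "std_mm": np.sqrt(var),                           # per-pixel temporal noise
            "invalid_ratio": 1.0 - n_valid / frames.shape[0], # dropout rate per pixel
        }

    # Example with synthetic data: 100 frames of a flat wall at 1 m plus 5 mm noise.
    rng = np.random.default_rng(0)
    stack = 1000.0 + rng.normal(0.0, 5.0, size=(100, 480, 640))
    stats = pixel_statistics(stack)
    print(stats["std_mm"].mean())                             # close to the simulated 5 mm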

Our major design goal for the test scenarios was to capture individual effects in as isolated a manner as possible. Furthermore, we designed the scenarios so that they can be reproduced and adapted to any other…

Conclusion

This paper presents an in-depth comparison between the two versions of the Kinect range sensor, i.e. the KinectSL, which is based on the Structured Light principle, and the new Time-of-Flight variant, the KinectToF. We present a framework for evaluating Structured Light and Time-of-Flight cameras, such as the two Kinect variants, into which we give detailed insight here. Our evaluation framework consists of seven experimental setups that cover the full range of known artifacts for these kinds of…

Acknowledgments

This research was partially funded by our collaboration partner Delphi Deutschland GmbH. The authors would like to thank Microsoft Inc. for making the prototype of the KinectToF cameras available via the Kinect For Windows Developer Preview Program (K4W DPP) and Dr. Rainer Bornemann from the Center for Sensor Systems of Northrhine-Westphalia (ZESS), Siegen, for the reference measurements of the illumination signal for the KinectToF camera and for the support in measuring the ambient…

This paper has been recommended for acceptance by Pushmeet Kohli.