Achuta Kadambi1, Ayush Bhandari1, Refael Whyte2, Adrian Dorrington2, Ramesh Raskar1
1Massachusetts Institute of Technology, 2University of Waikato
IEEE ICCP 2014, Santa Clara, CA
Several computer vision algorithms require a sequence of photographs taken under different illumination conditions, which has spurred development in the area of illumination multiplexing. Various techniques for optimizing the multiplexing process already exist, but they are geared toward regular or high-speed cameras. Such cameras are fast, but their coding operates on the order of milliseconds. In this paper we propose a fusion of two popular contexts: time of flight range cameras and illumination multiplexing. Time of flight cameras are a low-cost, consumer-oriented technology capable of acquiring range maps at 30 frames per second. Such cameras have a natural connection to conventional illumination multiplexing strategies, as both paradigms rely on the capture of multiple shots and synchronized illumination. While previous work on illumination multiplexing has exploited coding at millisecond intervals, we repurpose sensors that are ordinarily used in time of flight imaging to demultiplex via nanosecond coding strategies.
What is the main contribution of this project?
Many computer vision algorithms require photographs of a scene under different light sources (e.g. photometric stereo). There are ways to optimize the collection process: this is the multiplexed illumination problem in computer vision.
However, existing work is tailored toward conventional cameras. We derive a multiplexing model for time of flight cameras, the increasingly popular 3D cameras that form the basis of the new Microsoft Kinect.
What exactly is illumination multiplexing?
Suppose we have three lights, A, B, and C, illuminating a scene. If we take a photograph, we measure a combination of the three lights (this is the forward model). The inverse problem is to recover images as if only a single light were on. This is not a new problem and has been studied by Schechner's group at Technion (among others).
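To make the forward and inverse models concrete, here is a minimal sketch in NumPy. The specific code matrix is an assumption for illustration (an S-matrix-style pattern with two of three lights on per shot, not necessarily the pattern used in the paper); the demultiplexing step is simply a linear solve against the known mixing matrix.

```python
import numpy as np

# Hypothetical multiplexing code: each row says which lights (A, B, C)
# are on during one photograph. Two lights on per shot (S-matrix style).
W = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Unknown single-light intensities at one pixel (ground truth for the demo).
x_true = np.array([0.8, 0.3, 0.5])

# Forward model: each photograph measures a known mixture of the lights.
y = W @ x_true

# Inverse problem: demultiplex to recover the single-light images.
x_recovered = np.linalg.solve(W, y)
print(np.allclose(x_recovered, x_true))  # True
```

In practice the code matrix is chosen to maximize signal-to-noise ratio after inversion (the focus of the multiplexed-illumination literature), but the algebra is as above, applied per pixel.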
Why is illumination multiplexing tough for Time of Flight Cameras?
Time of flight cameras measure range by frequency-locking to a strobed illumination source and measuring the phase shift of the modulation signal. Such cameras are designed to lock on to a single light source at the modulation frequency.
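As a sketch of why phase encodes range, here is the standard four-bucket homodyne estimate (a textbook ToF recipe, not code from the paper; the modulation frequency and the noiseless unit-amplitude correlation samples are assumptions for clarity).

```python
import numpy as np

c = 3e8            # speed of light (m/s)
f_mod = 30e6       # assumed modulation frequency (Hz)
true_depth = 2.0   # metres (within the unambiguous range c / (2 * f_mod))

# Round-trip phase shift of the modulation signal.
phase_true = 4 * np.pi * f_mod * true_depth / c

# The sensor correlates the return with the reference at four offsets
# (0, 90, 180, 270 degrees); ideal noiseless samples shown here.
offsets = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
C = np.cos(phase_true - offsets)

# Recover phase from the four buckets, then convert phase to depth.
phase = np.arctan2(C[1] - C[3], C[0] - C[2])
depth = c * phase / (4 * np.pi * f_mod)
print(depth)  # ≈ 2.0
```

The key point for this paper: this pipeline assumes a single modulated source. With several sources strobing simultaneously, the correlation samples mix, which is why a multiplexing model for the nanosecond coding itself is needed.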
What are some consumer applications of this work?
We validate our technique by constructing a prototype camera. One application is to multiplex colored light sources to create what is, to our knowledge, the only color 3D time of flight camera.
Does this work in real-time?
Yes. The results are demonstrated in real time. See the paper for details.