Extracting Depth and Matte using a Color-Filtered Aperture

Yosuke Bando¹ ²    Bing-Yu Chen³    Tomoyuki Nishita²
¹TOSHIBA Corporation    ²The University of Tokyo    ³National Taiwan University

This paper presents a method for automatically extracting a scene depth map and the alpha matte of a foreground object by capturing a scene through RGB color filters placed in the camera lens aperture. By dividing the aperture into three regions through which only light in one of the RGB color bands can pass, we can acquire three shifted views of a scene in the RGB planes of an image in a single exposure. In other words, a captured image has depth-dependent color misalignment. We develop a color alignment measure to estimate disparities between the RGB planes for depth reconstruction. We also exploit color misalignment cues in our matting algorithm in order to disambiguate between the foreground and background regions even where their colors are similar. Based on the extracted depth and matte, the color misalignment in the captured image can be canceled, and various image editing operations can be applied to the reconstructed image, including novel view synthesis, post-exposure refocusing, and composition over different backgrounds.

Keywords: computational photography, computational camera, color filters, color correlation, depth estimation, alpha matting
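The disparity search described in the abstract, in which candidate shifts of the R and B planes are scored against the G plane by a color alignment measure, can be sketched as follows. This is only an illustrative toy: it assumes, for simplicity, that the R plane is shifted right and the B plane shifted left by the unknown disparity, and it scores candidates with a windowed sum of squared differences rather than the paper's actual color alignment measure; the function names are hypothetical.

```python
import numpy as np

def box_mean(a, win):
    """Local mean over a win x win window (edge-padded), via 2-D cumulative sums."""
    pad = win // 2
    ap = np.pad(a, pad, mode="edge")
    c = np.cumsum(np.cumsum(ap, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    s = c[win:, win:] - c[:-win, win:] - c[win:, :-win] + c[:-win, :-win]
    return s / (win * win)

def estimate_disparity(img, max_d=8, win=7):
    """Per-pixel disparity from RGB misalignment (toy sketch, not the paper's method).

    Assumed shift model: R moved right and B moved left by the (unknown)
    disparity.  Each candidate shift d is scored by how well the realigned
    R and B planes match the G plane over a local window; the best-scoring
    d is kept at each pixel.
    """
    r = img[..., 0].astype(float)
    g = img[..., 1].astype(float)
    b = img[..., 2].astype(float)
    h, w = g.shape
    best_score = np.full((h, w), -np.inf)
    best_d = np.zeros((h, w), dtype=int)
    for d in range(max_d + 1):
        r_al = np.roll(r, -d, axis=1)  # undo the assumed rightward shift of R
        b_al = np.roll(b, d, axis=1)   # undo the assumed leftward shift of B
        # Negated windowed SSD: higher means better alignment with G
        score = -(box_mean((r_al - g) ** 2, win) + box_mean((b_al - g) ** 2, win))
        better = score > best_score
        best_score[better] = score[better]
        best_d[better] = d
    return best_d
```

On synthetic data where the R and B planes are circularly shifted copies of G, the search recovers the shift exactly; on real captures, the paper's color alignment measure replaces the SSD score to handle the fact that the three planes record different color bands rather than identical intensities.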