Figure 1: Starting from a coarse depth map, is it possible to achieve laser-scan quality? By combining the information from the Kinect depth frame in (a) with the information in three polarized photographs (b), we reconstruct the 3D surface shown in (c). The subtle changes between polarization images provide additional information about surface orientation. See Figure 2 of this website for a laser-scan comparison.
Polarized 3D: High-Quality Depth Sensing with Polarization Cues, ICCV 2015 [PDF] [PDF Low Res]
Coarse depth maps can be enhanced using the shape information contained in polarization cues. We propose a framework to combine surface normals from polarization (hereafter, polarization normals) with an aligned depth map. Polarization normals have not been used for depth enhancement before, because they suffer from physics-based artifacts such as azimuthal ambiguity, refractive distortion, and fronto-parallel signal degradation. Our framework overcomes these key challenges, allowing the benefits of polarization to be used to enhance depth maps. Our results demonstrate improvement over state-of-the-art 3D reconstruction techniques.
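As a toy illustration of the fusion idea, combining a coarse depth map with gradients derived from surface normals can be posed as a regularized least-squares problem. This 1-D sketch is illustrative only: the function name and the simple finite-difference formulation are our own, and the paper's actual framework additionally resolves azimuthal ambiguity, refractive distortion, and the other artifacts described above.

```python
import numpy as np

def refine_depth_1d(z0, grad_from_normals, lam=10.0):
    """Toy 1-D sketch: fuse a coarse depth profile z0 with slopes derived
    from polarization normals by solving the regularized least squares
        min_z ||z - z0||^2 + lam * ||D z - g||^2
    where D is a forward finite-difference operator and g are the
    normal-derived gradients. (Illustrative only, not the paper's method.)"""
    z0 = np.asarray(z0, dtype=float)
    g = np.asarray(grad_from_normals, dtype=float)
    n = len(z0)
    # Forward-difference operator D: (Dz)[i] = z[i+1] - z[i]
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    # Normal equations: (I + lam D^T D) z = z0 + lam D^T g
    A = np.eye(n) + lam * D.T @ D
    b = z0 + lam * D.T @ g
    return np.linalg.solve(A, b)
```

With accurate gradients, the high-frequency error in the coarse depth is strongly attenuated while the low-frequency shape is anchored by the depth term, which mirrors the complementary roles of the two sensors in the paper.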
@inproceedings{kadambi2015polarized,
  author    = "Achuta Kadambi and Vage Taamazyan and Boxin Shi and Ramesh Raskar",
  title     = "Polarized 3D: High-Quality Depth Sensing with Polarization Cues",
  booktitle = "International Conference on Computer Vision (ICCV)",
  year      = "2015"
}
What is Polarized 3D?
Today, photographers use polarizing filters on 2D cameras to create stunning photos. Polarized 3D probes the question: what if a polarizing filter is used on a 3D camera? The answer: commodity depth sensors that operate at millimeter quality can be enhanced to micron quality, improving resolution by three orders of magnitude.
How does a polarized 2D camera obtain 3D geometry?
For about two centuries, the Fresnel equations have linked surface normals with material and polarimetric properties. However, these equations alone cannot solve for full 3D geometry. Our work is inspired by previous approaches in inverse rendering [Miyazaki03] and shape recovery [Atkinson06], but has a different goal: to recover full 3D shape. A sketch of our hardware is shown in Figure 3 of this website.
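For intuition, one widely used Fresnel-based relation ties the degree of polarization of diffusely reflected light to the surface zenith angle and the refractive index; because the relation is monotonic on [0, π/2), it can be inverted numerically. This is a hedged sketch: the function names are ours, and n = 1.5 is an assumed refractive index, not a value from the paper.

```python
import numpy as np

def diffuse_dop(theta, n=1.5):
    """Degree of polarization of diffuse reflection (Fresnel-based model)
    as a function of zenith angle theta and refractive index n."""
    s2 = np.sin(theta) ** 2
    num = (n - 1.0 / n) ** 2 * s2
    den = (2 + 2 * n**2
           - (n + 1.0 / n) ** 2 * s2
           + 4 * np.cos(theta) * np.sqrt(n**2 - s2))
    return num / den

def zenith_from_dop(rho, n=1.5, samples=10000):
    """Invert the monotonic degree-of-polarization curve by table lookup
    to recover the zenith angle (diffuse case)."""
    thetas = np.linspace(0.0, np.pi / 2 - 1e-6, samples)
    dops = diffuse_dop(thetas, n)
    return np.interp(rho, dops, thetas)
```

Note that this recovers only the zenith component of the normal; the azimuth comes from the phase of the polarization sinusoid and carries the π-ambiguity that the paper's framework must disambiguate.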
How does our technique compare with existing 3D scanning systems?
Polarized 3D achieves increased depth precision with respect to other modalities of similar optical complexity (e.g., Kinect, RealSense, Mesa, PMD). It compares favorably even with multi-thousand-dollar, industrial-grade laser scanners (Figure 2 of this website).
How does polarization compare to shape-from-shading or photometric-stereo?
Shape-from-shading and photometric stereo employ restrictive assumptions, such as Lambertian surfaces or distant or controlled lighting. As shown in Figure 4 of this website, for complex scenes (like Figure 5 of this website), polarization information may generalize better.
Does Polarized 3D operate in real-time?
Although the acquisition can be made real-time (with a polarization mosaic), the computation is not yet real-time, requiring minutes to compute a single depth frame. We are exploring faster algorithms and GPU implementations to eventually reach 30 Hz frame rates.
When will the software be available?
We provide software and hardware support as a courtesy to the research community. Before May 30, 2016, I plan to upload a dataset as well as executable MATLAB code. Note that this is before any of our IP is patented, so we are doing this at some risk to our commercial prospects (the second author is in the process of founding a startup). Once the second author has commercial matters settled, we also plan to upload a polished DIY guide; our intention is to use this as a class project in the MIT course we teach.
What are some consumer applications of this work?
This is a new tool for 3D sensing that finds use in virtual reality, autonomous navigation, and industrial inspection. In particular, the proposed technique is suited to applications that require precise 3D depth (e.g., 3D scanning).
Figure 2: Polarization-enhanced depth is, in some cases, a better option than industrial solutions, like this raster-based multistripe laser scanner (NextEngine 3D). As shown in the figure, Polarized 3D captures features down to 300 microns.
Figure 3: (a) The hardware prototype is low-cost---to reproduce it, one needs a depth sensor, a polarizer, and a DSLR. (b) Following Malus's law, the DSLR intensity should vary sinusoidally with the polarizer angle.
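The sinusoid in Figure 3(b) can be fit in closed form: writing the observed intensity as I(φ) = A + B·cos(2φ) + C·sin(2φ), three or more polarizer angles give a linear system whose solution yields the unpolarized intensity, the degree of polarization, and the (π-ambiguous) azimuth phase. A minimal sketch, with function and variable names of our own choosing:

```python
import numpy as np

def fit_polarization(intensities, angles_deg):
    """Fit I(phi) = A + B*cos(2*phi) + C*sin(2*phi) to measurements taken
    at >= 3 known polarizer angles. Returns the unpolarized intensity A,
    the degree of polarization rho, and the azimuth phase (pi-ambiguous)."""
    phi = np.deg2rad(np.asarray(angles_deg, dtype=float))
    I = np.asarray(intensities, dtype=float)
    # Linear model in the unknowns (A, B, C)
    M = np.stack([np.ones_like(phi), np.cos(2 * phi), np.sin(2 * phi)], axis=1)
    A, B, C = np.linalg.lstsq(M, I, rcond=None)[0]
    amplitude = np.hypot(B, C)
    rho = amplitude / A                  # degree of polarization
    azimuth = 0.5 * np.arctan2(C, B)     # phase, ambiguous by pi
    return A, rho, azimuth
```

With exactly three angles (as in Figure 1) the system is square and the fit is exact; additional angles make the estimate robust to noise via least squares.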
Figure 4: For complex materials and lighting, shading and photometric approaches require many more images.
Figure 5: Complex objects with mixed materials, like this shiny face, can be scanned. See Figure 9 of the paper for the polarization result.