x-hour Outdoor Photometric Stereo

While Photometric Stereo (PS) has long been confined to the lab, there has been recent interest in applying this technique to reconstruct outdoor objects and scenes. Unfortunately, the most successful outdoor PS techniques typically require gathering either months of data or waiting for a particular time of the year. In this paper, we analyze the illumination requirements for single-day outdoor PS to yield stable normal reconstructions, and determine that these requirements are often met in much less than a full day. In particular, we show that the right set of conditions for stable PS solutions may be observed in the sky within short time intervals of just over one hour. This work provides, for the first time, a detailed analysis of the factors affecting the performance of x-hour outdoor photometric stereo.
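For context, classical photometric stereo recovers per-pixel surface normals from several images of a Lambertian surface captured under different known lightings, by solving a small least-squares system at every pixel. The sketch below is a minimal illustration of that baseline with distant point lights; the function name and array shapes are our own choices and this is not the code released with the paper. The stability of the recovered normals depends on the conditioning of the light matrix, which is precisely what the paper analyzes for natural sky illumination over short time intervals.

```python
import numpy as np

def lambertian_photometric_stereo(images, light_dirs):
    """Classical Lambertian photometric stereo with known, distant lights.

    images:     (K, H, W) grayscale intensities, one image per lighting
    light_dirs: (K, 3) unit lighting directions (rows of the light matrix L)
    Returns per-pixel unit normals (3, H, W) and albedo (H, W).
    """
    K, H, W = images.shape
    I = images.reshape(K, -1)                      # stack pixels: (K, H*W)
    # Solve L @ B ~= I in the least-squares sense; B = albedo * normal
    B, _, _, _ = np.linalg.lstsq(light_dirs, I, rcond=None)
    albedo = np.linalg.norm(B, axis=0)             # per-pixel albedo
    normals = B / np.maximum(albedo, 1e-8)         # unit-length normals
    return normals.reshape(3, H, W), albedo.reshape(H, W)
```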

Paper

Yannick Hold-Geoffroy, Jinsong Zhang, Paulo F. U. Gotardo, and Jean-François Lalonde
x-hour Outdoor Photometric Stereo
International Conference on 3D Vision (3DV), 2015.
[PDF pre-print, 10.8MB] [BibTeX]
Best paper award (runner-up)!


Supplementary material

We provide additional results in this supplementary document.

Poster

You can download the poster presented at 3DV'15 and ICCP'16 in PDF format.

Talk

You can download the slides in Keynote, PPT, and PDF formats. Please cite the source if you use these slides in a presentation.

Code

Download the code in a ZIP file, or get it from github. For more information, please see this README file. Please cite this paper if you use the code in your publications!

Data

We provide some of the HDR sky maps that were used in this paper. Please see this website.

In addition, we also provide the "Day of the Owl (DotO)" data, which was used to generate Fig. 9 in the paper. The *_sphere.exr file contains a chrome sphere image used to align the environment map to the object's reference coordinate frame, which can be done with the envmapConversions toolbox (the sketch below illustrates the underlying geometry).
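A chrome sphere acts as a mirror ball: each pixel on the sphere reflects light arriving from the mirror reflection of the viewing ray about the local sphere normal. The snippet below is a minimal, illustrative sketch of that mapping, assuming an orthographic camera and image coordinates normalized over the sphere; it is not the envmapConversions API, which should be used for the actual alignment.

```python
import numpy as np

def sphere_pixel_to_world_dir(u, v):
    """Map a point on a mirror-ball image to the world direction it reflects.

    (u, v) are image coordinates normalized to [-1, 1] over the sphere's
    bounding square, assuming an orthographic camera looking along -z.
    """
    r2 = u * u + v * v
    if r2 > 1.0:
        raise ValueError("point lies outside the sphere")
    # Sphere surface normal at this pixel (unit length by construction)
    n = np.array([u, v, np.sqrt(1.0 - r2)])
    view = np.array([0.0, 0.0, 1.0])               # direction toward the camera
    # Mirror reflection: d = 2 (n . v) n - v
    return 2.0 * np.dot(n, view) * n - view
```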

Download Owlie's 3D model (MATLAB) and normal map:

Video

Acknowledgements

The authors gratefully acknowledge the following funding sources:

  • NSERC Discovery Grant RGPIN-2014-05314
  • FRQ-NT New Researcher Grant 2016-NC-189939
  • REPARTI Strategic Network
  • New faculty startup grant from the ECE department at Laval University
  • NVIDIA Corporation for the donation of the Tesla K40 GPU used for this research
