Learning High Dynamic Range from Outdoor Panoramas

Outdoor lighting has extremely high dynamic range. This makes capturing outdoor environment maps notoriously challenging, since special equipment must be used. In this work, we propose an alternative approach. We first capture lighting with a regular, LDR omnidirectional camera, and aim to recover the HDR information after the fact via a novel, learning-based inverse tonemapping method. We propose a deep autoencoder framework which regresses linear, high dynamic range data from non-linear, saturated, low dynamic range panoramas. We validate our method through a wide set of experiments on synthetic data, as well as on a novel dataset of real photographs with ground truth. Our approach finds applications in a variety of settings, ranging from outdoor light capture to image matching.
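To illustrate why this inverse problem is hard, the sketch below simulates how a camera produces a non-linear, saturated LDR observation from linear HDR radiance. This is an illustrative NumPy toy (a simple gamma response and 8-bit quantization, not the paper's exact camera model): every radiance value above the saturation point collapses to the same pixel value, so the network must infer the true intensities from context.

```python
import numpy as np

def hdr_to_ldr(hdr, exposure=1.0, gamma=2.2):
    """Simulate an LDR capture from linear HDR radiance:
    scale by exposure, apply a non-linear response (gamma),
    clip saturated values, and quantize to 8 bits.
    (Illustrative model, not the paper's exact pipeline.)"""
    x = np.clip(hdr * exposure, 0.0, None)
    x = np.power(x, 1.0 / gamma)       # non-linear camera response
    x = np.clip(x, 0.0, 1.0)           # sensor saturation
    return np.round(x * 255.0) / 255.0 # 8-bit quantization

# Toy "panorama": linear radiance spanning several orders of magnitude,
# e.g. shadowed ground (~0.01) up to a bright sun region (~1000).
hdr = np.array([0.01, 0.1, 1.0, 10.0, 1000.0])
ldr = hdr_to_ldr(hdr, exposure=0.25)
# Both 10.0 and 1000.0 saturate to the same LDR value of 1.0,
# which is exactly the information the learned model must recover.
```

Training pairs for such a method can be generated by applying this kind of forward model to HDR panoramas, then regressing the original linear values from the clipped result.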


Jinsong Zhang and Jean-François Lalonde
Learning High Dynamic Range from Outdoor Panoramas
International Conference on Computer Vision, 2017
[arXiv:1703.10200 pre-print] [BibTeX]

Supplementary material

We provide additional results on this supplementary page.


You can download the slides for the ICCV'17 oral presentation in PDF format.

In case you missed the fun in Venice, here is the talk as delivered by Jinsong Zhang!


You can download the poster presented at ICCV'17 in PDF format.


The dataset used in this paper is available here.


The authors gratefully acknowledge the following funding sources:

  • FRQ-NT New Researcher Grant 2016NC189939
  • NSERC Discovery Grant RGPIN-2014-05314
  • NVIDIA Corporation, for the donation of the Tesla K40 and Titan X GPUs used for this research
  • REPARTI Strategic Network
