HDI4D 1yr Tommy

Progress on goal 1.2.1 (Interactive image-based lighting (IBL) using high dynamic range (HDR) radiance maps)

We have developed a system for real-time IBL using HDR radiance maps and implemented it in the project platform, Unity3D. The system is simple to understand and implement, and is highly performant. Not only does it support HDR radiance maps, but we have extended it to apply lighting from existing low dynamic range (LDR) 360-degree panoramic video, such as that found on YouTube. Lighting is calculated solely from the video input, with no prior analysis of the video required. This allows us to render virtual objects realistically according to the exact scene shown, in real time, even if the scene is changing and even if an HDR capture of the scene is unavailable.
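As a rough illustration only (not the project's actual code), a Unity3D component along the following lines could feed each frame of a playing 360-degree video into the engine's environment lighting. The class name, the ldrToHdr expansion material, the equirectangular skybox material, and the chosen resolution are all assumptions made for this sketch; the VideoPlayer API assumes a Unity version that provides it.

    // Hypothetical sketch: drive Unity's environment lighting from a playing
    // 360-degree panoramic video. Names and materials here are illustrative
    // assumptions, not the project's actual implementation.
    using UnityEngine;
    using UnityEngine.Video;

    public class VideoEnvironmentLighting : MonoBehaviour
    {
        public VideoPlayer videoPlayer;   // source 360-degree panoramic video
        public Material ldrToHdr;         // hypothetical LDR-to-HDR expansion shader
        public Material skyboxMaterial;   // skybox material sampling an equirectangular texture

        RenderTexture hdrFrame;

        void Start()
        {
            // HDR-capable render target for the expanded video frame.
            hdrFrame = new RenderTexture(1024, 512, 0, RenderTextureFormat.ARGBHalf);
            RenderSettings.skybox = skyboxMaterial;
            RenderSettings.ambientMode = UnityEngine.Rendering.AmbientMode.Skybox;
        }

        void Update()
        {
            if (videoPlayer.texture == null) return;

            // Expand the current LDR video frame to HDR on the GPU.
            Graphics.Blit(videoPlayer.texture, hdrFrame, ldrToHdr);

            // Feed the HDR frame to the skybox and refresh ambient lighting so
            // virtual objects pick up the current scene illumination.
            skyboxMaterial.SetTexture("_MainTex", hdrFrame);
            DynamicGI.UpdateEnvironment();
        }
    }

In a real-time setting the environment update could be amortized over several frames if needed to preserve the target frame rate.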

Real-Time IBL System Pipeline

The system works by first converting a single frame of the input video from LDR to HDR, then analyzing the HDR frame in real time and using this analysis to believably render the virtual objects as if lit by the input scene. A standard view of the input video is drawn, and the virtual objects are overlaid onto it to produce the final output. For stereo output to a head-mounted display (HMD), the rendering process for the virtual objects is performed twice, giving a separate view for each eye so that the virtual objects appear at a perceptible distance from the viewer.
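The report does not state which inverse tone-mapping operator is used for the first step; purely as an illustration of the idea, a simple inverse-gamma plus exposure boost per pixel could look like the sketch below (the gamma and exposure values are assumptions, and in practice this step would run per pixel on the GPU, e.g. via a shader used with Graphics.Blit).

    // Hypothetical sketch of the LDR-to-HDR expansion step; the operator and
    // constants are assumptions chosen only to illustrate the idea.
    using UnityEngine;

    public static class InverseToneMap
    {
        // Expand one LDR pixel into approximate linear HDR radiance.
        public static Color ExpandToHdr(Color ldr, float gamma = 2.2f, float exposure = 4.0f)
        {
            float r = Mathf.Pow(ldr.r, gamma) * exposure;  // undo display gamma,
            float g = Mathf.Pow(ldr.g, gamma) * exposure;  // then scale so bright
            float b = Mathf.Pow(ldr.b, gamma) * exposure;  // pixels regain energy
            return new Color(r, g, b, ldr.a);
        }
    }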

Our system was tested on both legacy (Nvidia GeForce 690) and state-of-the-art (Nvidia GeForce 980) graphics cards, and both were able to render to the Oculus DK2 HMD at more than 75 Hz, significantly exceeding the original goal of 30 Hz. Results using LDR environment maps (such as those obtained from standard 360-degree video) were not expected to be of as high quality as those using specially captured HDR environment maps; however, using our developed system we found the rendering quality to be believable and realistic even for LDR video.

Stereo Output for Underwater Video Including Virtual Teapots