Time-multiplexed Neural Holography | SIGGRAPH 2022

Suyeon Choi*, Manu Gopakumar*, Yifan (Evan) Peng, Jonghyun Kim, Matthew O'Toole, Gordon Wetzstein

A flexible framework for holographic near-eye displays with fast heavily-quantized spatial light modulators.

SIGGRAPH 2022 - 6 MIN TECH TALK

ABSTRACT

Holographic near-eye displays offer unprecedented capabilities for virtual and augmented reality systems, including perceptually important focus cues. Although artificial intelligence–driven algorithms for computer-generated holography (CGH) have recently made much progress in improving the image quality and synthesis efficiency of holograms, these algorithms are not directly applicable to emerging phase-only spatial light modulators (SLM) that are extremely fast but offer phase control with very limited precision. The speed of these SLMs offers time multiplexing capabilities, essentially enabling partially-coherent holographic display modes. Here we report advances in camera-calibrated wave propagation models for these types of near-eye holographic displays and we develop a CGH framework that robustly optimizes the heavily quantized phase patterns of fast SLMs. Our framework is flexible in supporting runtime supervision with different types of content, including 2D and 2.5D RGBD images, 3D focal stacks, and 4D light fields. Using our framework, we demonstrate state-of-the-art results for all of these scenarios in simulation and experiment.

Unlike conventional displays, which directly present the desired intensities, holographic displays drive an SLM with computed phase patterns that modulate the phase of light at each pixel. The modulated wavefield propagates and reconstructs a 3D scene in a volume, as shown in the second row.
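The propagation step described above is commonly modeled with the angular spectrum method (ASM). Below is a minimal NumPy sketch, assuming a square SLM with uniform pixel pitch; the function and parameter names are illustrative and not taken from the paper's code.

```python
import numpy as np

def asm_propagate(field, z, wavelength, pitch):
    """Propagate a complex field by distance z with the angular spectrum
    method: multiply the field's 2D Fourier transform by the free-space
    transfer function, then transform back."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)  # spatial frequencies (cycles/m)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # H = exp(i 2*pi*z * sqrt(1/lambda^2 - fx^2 - fy^2));
    # evanescent components (negative argument) are zeroed out.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.where(arg >= 0,
                 np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: a phase-only SLM pattern propagated 10 mm at 532 nm.
phase = np.random.default_rng(0).uniform(0, 2 * np.pi, (256, 256))
slm_field = np.exp(1j * phase)  # unit amplitude: phase-only modulation
image_field = asm_propagate(slm_field, z=10e-3, wavelength=532e-9, pitch=8e-6)
intensity = np.abs(image_field) ** 2
```

Because the transfer function is unit-magnitude within the propagating band, energy is conserved; time multiplexing then corresponds to averaging the intensities of several such reconstructions from different phase patterns.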

FILES

CITATION

S. Choi*, M. Gopakumar*, Y. Peng, J. Kim, M. O’Toole, and G. Wetzstein, “Time-multiplexed Neural Holography: A flexible framework for holographic near-eye displays with fast heavily-quantized spatial light modulators”, SIGGRAPH 2022.

@inproceedings{choi2022time,
title={Time-multiplexed Neural Holography: A flexible framework for holographic near-eye displays with fast heavily-quantized spatial light modulators},
author={Choi, Suyeon and Gopakumar, Manu and Peng, Yifan and Kim, Jonghyun and O’Toole, Matthew and Wetzstein, Gordon},
booktitle={ACM SIGGRAPH 2022 Conference Proceedings},
pages={1--9},
year={2022}
}

Illustration of our framework. The complex-valued field at the SLM is adjusted by several learnable terms (including a discrete lookup table) and then processed by a CNN. The resulting complex-valued wave field is propagated to all target planes using the ASM wave propagation operator, with learnable amplitude and phase in the Fourier domain. The wave field at each target plane is processed again by a smaller CNN. The proposed framework supports multiple forms of input, including 2D, 2.5D, 3D, and 4D content.
Display prototype. Photograph of the prototype holographic near-eye display system.
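The "discrete lookup table" above reflects that fast SLMs offer only a handful of phase levels. The sketch below shows the forward quantization step only: each continuous phase value snaps to the nearest entry of a (possibly non-uniform, calibrated) lookup table. How the optimization handles the nondifferentiability of this step is beyond this illustrative sketch; the names here are hypothetical, not the paper's code.

```python
import numpy as np

def quantize_phase(phase, lut):
    """Snap each continuous phase value to the nearest lookup-table level,
    measuring distance on the phase circle (mod 2*pi)."""
    phase = np.mod(phase, 2 * np.pi)
    # Wrapped difference between every pixel and every LUT level.
    diff = np.angle(np.exp(1j * (phase[..., None] - lut[None, None, :])))
    idx = np.abs(diff).argmin(axis=-1)
    return lut[idx]

lut = np.linspace(0, 2 * np.pi, 4, endpoint=False)  # e.g. a 2-bit SLM: 4 levels
phase = np.random.default_rng(1).uniform(0, 2 * np.pi, (8, 8))
quantized = quantize_phase(phase, lut)
```

With 4 uniformly spaced levels, no pixel ends up more than pi/4 away (on the circle) from its original phase.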


The following three results were captured experimentally with our holographic display prototype while varying the focus of the camera. Our framework produces high-quality, per-pixel depth cues together with shallow depth of field and natural blur behavior.

Additional 3D experimental results (Big Buck Bunny)

Additional 3D experimental results (SINTEL)

4D Light Field supervision

Our method is flexible enough to use other types of input, such as a 4D light field. Because our framework directly supervises the full light field, it is, to date, the only method that can reconstruct a full, high-quality light field. Other methods either fail to reconstruct high image quality because they are not optimized for it, or, as shown below, rely on smooth target phases and fail to reconstruct the full light field because the smooth-phase assumption significantly sacrifices étendue. Below, we show the reconstructed light field and focal stack. The result also exhibits shallow depth-of-field blur consistent with the parallax information in the light field input.

*The first row is representative of all holographic methods that use smooth target phases (e.g., OLAS (Padmanaban et al. 2019), Shi et al. 2021, …). The reduced étendue caused by the smooth-target-phase assumption results in an eye box even smaller than what the SLM can support, whereas our method preserves the full eye box.
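A toy way to see the connection between light field views and the eye box: windowing disjoint regions of a field's Fourier (pupil) plane and transforming each window back yields sub-aperture perspective views. A smooth phase concentrates energy in a few such apertures, which is the étendue loss described above. This sketch is only an illustration of that decomposition under simplified assumptions, not the paper's light field supervision.

```python
import numpy as np

def subaperture_views(field, n_views):
    """Split the centered Fourier plane of a complex field into an
    n_views x n_views grid of disjoint apertures; each windowed aperture,
    inverse-transformed, gives the intensity of one perspective view."""
    F = np.fft.fftshift(np.fft.fft2(field))
    ny, nx = F.shape
    sy, sx = ny // n_views, nx // n_views
    views = np.empty((n_views, n_views, ny, nx))
    for i in range(n_views):
        for j in range(n_views):
            win = np.zeros_like(F)
            win[i*sy:(i+1)*sy, j*sx:(j+1)*sx] = F[i*sy:(i+1)*sy, j*sx:(j+1)*sx]
            views[i, j] = np.abs(np.fft.ifft2(np.fft.ifftshift(win))) ** 2
    return views

# A random phase-only field spreads energy across the whole pupil,
# so every sub-aperture view carries a share of the total energy.
field = np.exp(1j * np.random.default_rng(2).uniform(0, 2 * np.pi, (64, 64)))
lf = subaperture_views(field, 4)
```

By Parseval's theorem, the disjoint apertures partition the field's energy, so the view intensities sum to the total energy of the input field.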


Additional 2D experimental results

Related Projects

You may also be interested in related projects from our group on holographic near-eye displays:

  • J. Kim et al. “Holographic Glasses”, SIGGRAPH, 2022 (link)
  • M. Gopakumar et al. “Unfiltered Holography”, Optics Letters, 2021 (link)
  • S. Choi et al. “Neural 3D Holography”, ACM SIGGRAPH Asia, 2021 (link)
  • S. Choi et al. “Michelson Holography”, Optica, 2021 (link)
  • Y. Peng et al. “Neural Holography”, ACM SIGGRAPH Asia 2020 (link)
  • N. Padmanaban et al. “Holographic Near-Eye Displays Based on Overlap-Add Stereograms”, ACM SIGGRAPH Asia 2019 (link)


Acknowledgements

We thank Cindy Nguyen for helpful discussions. This project was in part supported by a Kwanjeong Scholarship, a Stanford SGF, Intel, NSF (award 1839974), a PECASE by the ARO, and Sony.