Michelson Holography | OSA Optica 2021

Suyeon Choi, Jonghyun Kim, Yifan (Evan) Peng, Gordon Wetzstein

A holographic display technology that optimizes image quality for emerging near-eye displays using two SLMs and camera-in-the-loop calibration.

 

ABSTRACT

We introduce Michelson holography (MH), a holographic display technology that optimizes image quality for emerging holographic near-eye displays. Using two spatial light modulators (SLMs), MH is capable of leveraging destructive interference to optically cancel out undiffracted light corrupting the observed image. We calibrate this system using emerging camera-in-the-loop holography techniques and demonstrate state-of-the-art 2D and multi-plane holographic image quality.
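The cancellation idea can be illustrated with a toy numerical sketch (our illustrative model, not the paper's implementation). A phase-only SLM leaks an undiffracted zero-order ("DC") term that it cannot remove on its own; with two SLMs in a Michelson arrangement, a pi phase offset between the two arms lets the two DC terms interfere destructively while the phase-modulated fields remain free for the optimizer to shape. The DC amplitude `a` and all array sizes below are assumptions for illustration.

```python
import numpy as np

a = 0.3                                               # assumed undiffracted amplitude per SLM
rng = np.random.default_rng(42)
phi1 = rng.uniform(0.0, 2.0 * np.pi, (16, 16))        # phase pattern on SLM 1
phi2 = rng.uniform(0.0, 2.0 * np.pi, (16, 16))        # phase pattern on SLM 2

# Full field in each interferometer arm: modulated field plus DC leak.
arm1 = np.exp(1j * phi1) + a
arm2 = np.exp(1j * np.pi) * (np.exp(1j * phi2) + a)   # second arm carries a pi path offset

dc_single = abs(a)                                    # DC residual with one SLM alone
dc_double = abs(a + a * np.exp(1j * np.pi))           # combined DC term after superposition

print(dc_single, dc_double)                           # the combined DC term is ~0
```

With a single SLM the undiffracted term corrupts the image at full amplitude `a`; in the two-arm superposition the two DC contributions sum to (numerically) zero, which is the interference the display exploits optically.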

FILES

    • Technical Paper and Supplement (link)

CITATION

S. Choi, J. Kim, Y. Peng, G. Wetzstein, Optimizing image quality for holographic near-eye displays with Michelson Holography, OSA Optica, 2021.

 

BibTeX

@article{Choi:2021:MichelsonHolography,
author = {S. Choi and J. Kim and Y. Peng and G. Wetzstein},
title = {{Optimizing Image Quality for Holographic Near-eye Displays with Michelson Holography}},
journal = {OSA Optica},
year = {2021},
}

 

Holographic display setup schematic. Our display uses a fiber-coupled RGB laser module, collimating optics, two liquid crystal on silicon (LCoS) spatial light modulators, and a machine vision camera.

 

Press Coverage

    • OSA Optica News (link)
    • Communications of ACM TechNews (link)
    • Physics.org News (link)
    • Sci Tech Daily (link)
    • Photonics Media (link)
    • Brinkwire Technology (link)
    • EurekAlert News (link)
    • Knowledia News (link)

Overview of results

Experimental 2D results comparing the conventional single-SLM configuration with Michelson Holography’s dual-SLM configuration.
Experimental 3D results in multi-plane display mode, generated with Michelson Holography and captured with our holographic near-eye display prototype.
Convergence for a single color channel, showing the captured holographic image (top) and the corresponding phase patterns displayed on the two spatial light modulators as the algorithm converges.
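An optimization loop of this kind can be emulated offline. The sketch below uses the classic Gerchberg–Saxton projection algorithm with an idealized Fourier-transform propagation model as a stand-in: the paper's camera-in-the-loop approach instead optimizes the phase patterns on both SLMs against intensities measured by the physical camera, so every model choice, function name, and parameter here is an assumption for illustration.

```python
import numpy as np

def gerchberg_saxton(target_amp, iters=100, seed=0):
    """Phase retrieval for one SLM and one plane via Gerchberg-Saxton,
    using an energy-preserving FFT as the propagation model (a simplifying
    assumption; camera-in-the-loop holography uses real captures instead)."""
    rng = np.random.default_rng(seed)
    phi = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iters):
        img = np.fft.fft2(np.exp(1j * phi), norm="ortho")  # SLM plane -> image plane
        img = target_amp * np.exp(1j * np.angle(img))      # keep phase, enforce target amplitude
        slm = np.fft.ifft2(img, norm="ortho")              # image plane -> SLM plane
        phi = np.angle(slm)                                # enforce phase-only constraint
    return phi

def recon_error(phi, target_amp):
    recon = np.abs(np.fft.fft2(np.exp(1j * phi), norm="ortho"))
    return float(np.mean((recon - target_amp) ** 2))

# Hypothetical target: a bright square, scaled so its energy matches that of
# a unit-amplitude phase-only source under the energy-preserving FFT.
n = 32
target = np.zeros((n, n))
target[8:24, 8:24] = 1.0
target *= np.sqrt(n * n / np.sum(target ** 2))

phi0 = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, (n, n))  # same init as above
phi = gerchberg_saxton(target)
# recon_error(phi, target) is far below recon_error(phi0, target): the
# reconstructed image sharpens as the phase pattern converges.
```

As in the convergence figure, the reconstruction error drops as the loop alternates between enforcing the target amplitude at the image plane and the phase-only constraint at the SLM plane.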

Related Projects

You may also be interested in related projects from our group on holographic near-eye displays:

  • S. Choi et al. “Neural 3D Holography: Learning Accurate Wave Propagation Models for 3D Holographic Virtual and Augmented Reality Displays”, ACM SIGGRAPH Asia 2021 (link)
  • Y. Peng et al. “Neural Holography”, ACM SIGGRAPH Asia 2020 (link)
  • N. Padmanaban et al. “Holographic Near-Eye Displays Based on Overlap-Add Stereograms”, ACM SIGGRAPH Asia 2019 (link)

and other next-generation near-eye display and wearable technology:

  • R. Konrad et al. “Gaze-contingent Ocular Parallax Rendering for Virtual Reality”, ACM Transactions on Graphics 2020 (link)
  • B. Krajancich et al. “Optimizing Depth Perception in Virtual and Augmented Reality through Gaze-contingent Stereo Rendering”, ACM SIGGRAPH Asia 2020 (link)
  • B. Krajancich et al. “Factored Occlusion: Single Spatial Light Modulator Occlusion-capable Optical See-through Augmented Reality Display”, IEEE TVCG, 2020 (link)
  • N. Padmanaban et al. “Autofocals: Evaluating Gaze-Contingent Eyeglasses for Presbyopes”, Science Advances 2019 (link)
  • K. Rathinavel et al. “Varifocal Occlusion-Capable Optical See-through Augmented Reality Display based on Focus-tunable Optics”, IEEE TVCG 2019 (link)
  • N. Padmanaban et al. “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays”, PNAS 2017 (link)
  • R. Konrad et al. “Accommodation-invariant Computational Near-eye Displays”, ACM SIGGRAPH 2017 (link)

Acknowledgements

Suyeon Choi was supported by a Kwanjeong Scholarship and a Korea Government Scholarship. This project was further supported by Ford, NSF (awards 1553333 and 1839974), a Sloan Fellowship, an Okawa Research Grant, and a PECASE by the ARO. We thank Ward Lopes, Morgan McGuire, and David Luebke for advice.